Export Controls on Encryption Software
by Ira S. Rubinstein and Michael Hintze, Coping With US Export Controls 2000 (12/2000)

This article was originally published in Coping with U.S. Export Controls 2000 (Practising Law Institute, Commercial Law and Practice Course Handbook Series, December 2000) (Cite as: 812 PLI/Comm 505). This is a revised and updated version of a paper presented at the COPING WITH U.S. EXPORT CONTROLS 1998 program. Copyright (c) 2000, Practising Law Institute, Ira S. Rubinstein & Michael Hintze.


Cryptography--the process of scrambling data to hide its content--is a critical technology for protecting valuable or sensitive information from unauthorized disclosure. Historically, national security concerns have dominated national cryptography policy due to the government's need for secure communications and intelligence gathering. In support of these vital interests, the Departments of State and Commerce, with assistance from the National Security Agency (NSA), play leading roles in the review and approval of cryptographic products for export.

Over the last decade, cryptography has moved from the highly secretive world of intelligence gathering into the mainstream of computer technology and electronic commerce. With ever increasing use of computer networks and electronic communications for the conduct of industry, government and personal affairs, businesses and consumers have come to rely on encryption to secure electronic fund transfers, protect proprietary and other sensitive information, and ensure the privacy and security of corporate databases, email, and other electronic records and transmissions (including wireless communications). With the growth of the Internet and various on-line services as well as government and industry plans to develop a framework for global electronic commerce, the international reliance on encryption for protecting digital information continues to expand rapidly.

To meet the worldwide commercial demand for data security products, many U.S. software vendors initially added encryption functions to popular messaging and network programs such as Lotus Notes, Novell Netware, and Microsoft Windows NT. Then, with the rise of the Internet, vendors began adding support for security protocols to popular Internet products such as Netscape Navigator and Microsoft Internet Explorer. Today, nearly all widely used software programs contain some encryption capabilities.

Although cryptography is studied at universities around the world, and encryption products have been readily available from foreign vendors for years, the U.S. government maintained strict controls on the export of encryption products for national security and law enforcement reasons. The computer industry has long argued that restricting exports of mass-market software programs with encryption capabilities is ineffective and harms U.S. competitiveness in worldwide markets. But successive Administrations have largely rejected these arguments. Incremental changes to the U.S. export controls over the last several years initially sought to encourage market acceptance of products that allow for government access to encryption keys or the plain text of encrypted data by relaxing export controls on such products. Only in the last two years has the U.S. policy moved toward allowing the broad exportability of most encryption products, but complex rules surrounding such exports remain.

This article provides a comprehensive review of export controls on encryption software. It examines the current state of Commerce Department controls on encryption software and technology, including the October 19, 2000 update to the regulations. [FN3] It also looks at a number of selected policy issues including two federal court cases challenging the constitutionality of encryption export controls, U.S. government policy regarding source code, posting encryption software to the Internet, and "crypto with a hole."


A short primer on cryptographic terms and concepts will help set the stage for the analysis of export licensing and related issues. [FN4]

§ 2(a) Background

Since ancient times, military leaders, diplomats, and spies have used cryptography to hide the contents of a message from adversaries. For example, Julius Caesar used a form of "secret writing" in which every letter in a word was replaced by the letter occurring three places later in the alphabet. Thus, "ATTACK AT DAWN" becomes "DWWDFN DW GDZQ." To unscramble this message, one reverses the encryption scheme by moving each letter three places to the left in the alphabet. Scrambling the message is called encryption. Descrambling it is called decryption. A cryptographic algorithm (or cipher) is the mathematical function used for encryption and decryption. Caesar's cipher uses a simple substitution function based on the number three. Modern encryption algorithms use more complex functions together with a key--a long string of bits (ones and zeros)--to secure messages against extremely powerful computer-based attacks.
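Caesar's cipher can be sketched in a few lines of code. This is a toy illustration only, using the three-place shift from the example above (the sketch assumes an uppercase message):

```python
# Toy implementation of Caesar's three-place substitution cipher.
def caesar(text: str, shift: int) -> str:
    out = []
    for ch in text:
        if ch.isalpha():
            # Shift within the 26-letter alphabet, wrapping around.
            # Assumes uppercase input.
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)  # leave spaces and punctuation unchanged
    return "".join(out)

def encrypt(plaintext: str) -> str:
    return caesar(plaintext, 3)    # three places to the right

def decrypt(ciphertext: str) -> str:
    return caesar(ciphertext, -3)  # three places to the left

print(encrypt("ATTACK AT DAWN"))  # DWWDFN DW GDZQ
print(decrypt("DWWDFN DW GDZQ"))  # ATTACK AT DAWN
```

Because there are only 25 possible shifts, an attacker can simply try them all -- a preview of the brute-force key searches discussed below.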

Encryption allows users to send confidential messages. This is also called data encryption. In addition, there are various encryption protocols to ensure that a message is really from the sender and not an impostor (authentication) and has not been modified during transmission or storage (integrity). Data encryption has been compared to an envelope, because it prevents eavesdroppers from discovering the contents of a message. Authentication and integrity more closely resemble a signature on a letter, proving that the letter originated with the sender and has not been altered since it was signed. These latter two functions are frequently referred to as "digital signatures."

§ 2(b) Secret-Key and Public-Key Encryption

There are two main types of modern cryptographic systems: secret-key systems and public-key systems. In a secret-key system, the encryption and decryption keys are the same. For example, in Caesar's cipher, the key depends on moving each letter three places in the alphabet: three places to the right to encrypt, and three places to the left to decrypt. Secret-key systems therefore require the sender and the receiver to share the same key, which both parties must keep secret because anyone else who discovers the key can read any message encrypted with this key. [FN5] Furthermore, secret-key encryption requires a trusted method for distributing keys. This may work in a military setting but it is impractical in modern communication systems, where there typically is no secure channel for exchanging secret keys among millions of potential users.

In the mid-1970s, two Stanford University scientists (Whitfield Diffie and Martin Hellman) solved this problem by inventing a new approach to cryptography known as public-key encryption. Diffie and Hellman developed a system with two mathematically related keys. Although the keys form a matched pair, it is computationally infeasible to derive one key from the other. Therefore, the system allows users to openly publish one key in a phone-book-like directory (the "public key"), while keeping the other key private (the "private key"). The mathematical relationship between the two keys is such that a message encrypted with one key can be decrypted by the other (and vice versa). In other words, each key is the inverse function of the other; what one does, only the other can undo.

How does a public-key system avoid the need to share secret keys? In one popular public-key system developed at MIT, [FN6] it works as follows: Imagine that Alice wants to send Bob an encrypted email message. She looks up Bob's public key in a public-key directory. To send Bob a private message, she scrambles her email with Bob's public key. Bob then decrypts Alice's message using his private key. The result: as long as Bob keeps his private key private, no one else can read Alice's message except him.

More commonly, public-key encryption is used for "key exchange." Because secret-key encryption encrypts and decrypts data more quickly than a public-key system of comparable strength, secret-key encryption is more desirable, especially when large amounts of data must be encrypted and decrypted. But as discussed above, secret-key encryption is not useful unless there is a secure way to exchange secret keys. Public-key encryption provides the needed security for key exchange. Thus, public-key encryption can be used at the beginning of a communication session to exchange a secret key, and the remainder of the session can be conducted using the more efficient secret-key encryption.
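The hybrid pattern described above can be sketched as follows. This is a toy illustration only: the "secret-key cipher" here is a throwaway XOR keystream built from SHA-256, standing in for a real cipher such as DES, and the session key is simply generated locally rather than exchanged under an actual public-key protocol.

```python
# Toy sketch of hybrid encryption: a random session key (which in a real
# system would be exchanged under public-key encryption) keys a fast
# symmetric cipher for the bulk of the traffic. The XOR keystream below
# is a stand-in for a real secret-key cipher, not a secure design.
import hashlib
import os

def toy_stream_cipher(key: bytes, data: bytes) -> bytes:
    # Derive a keystream from the key via SHA-256 and XOR it with the data.
    # XOR is its own inverse, so the same call both encrypts and decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

session_key = os.urandom(16)  # in practice, sent to the peer under their public key
ciphertext = toy_stream_cipher(session_key, b"the remainder of the session")
assert toy_stream_cipher(session_key, ciphertext) == b"the remainder of the session"
```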

Public-key encryption is also used for "digital signatures." In the above example, Alice can digitally sign her message to Bob using public-key encryption. She does so by encrypting the message with her own private key. Bob receives the message and looks up Alice's public key in the directory. If Bob can decrypt the message using Alice's public key, this confirms the mathematical relationship between the two keys. If Alice's public key does not decrypt the message, Bob knows that the message purportedly from Alice was not signed with Alice's key or that someone altered the message during transmission.

Thus, public-key encryption and digital signatures take place without any sharing of private keys: Alice and Bob use only one another's public key or their own private keys. Because public key encryption permits secure communications among parties with no prior trust relationship, it enables many exciting new technologies such as secure electronic commerce and digital cash.
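The inverse relationship between the two keys can be illustrated with a toy RSA key pair. The numbers below are textbook-sized for readability (real RSA moduli are hundreds of digits long) and are not drawn from any actual product:

```python
# Toy RSA key pair: encrypting with one key is undone only by the other.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent; the public key is (e, n)
d = pow(e, -1, phi)        # private exponent: modular inverse of e

def rsa_apply(m: int, exp: int) -> int:
    # Modular exponentiation is the core RSA operation.
    return pow(m, exp, n)

# Confidentiality: Alice encrypts with Bob's public key; only Bob's
# private key recovers the message.
message = 65
ciphertext = rsa_apply(message, e)
assert rsa_apply(ciphertext, d) == message

# Digital signature: the inverse direction. Alice signs with her private
# key; anyone holding her public key can verify.
signature = rsa_apply(message, d)
assert rsa_apply(signature, e) == message
```

Note that the message travels as a number smaller than the modulus; real systems break data into blocks (or, as described above, use RSA only to protect a symmetric session key).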

§ 2(c) Cryptographic Key Lengths

Key lengths in modern secret-key encryption algorithms typically range in size from 40 to 128 bits. The approximate difficulty of breaking an encrypted message by "guessing" the right key is proportional to the number of possible key values. If the key is 8 bits long, there are 2^8 = 256 possible keys. Therefore, it will take up to 256 attempts to find the correct key, with an expected number of attempts of 128. If the key is 40 bits long, there are 2^40 (over one trillion) possible keys. On the other hand, modern computers are extremely fast at searching all possible keys and even faster when they are networked together. For example, Schneier estimates that a network of 400 computers with fast commercially available chips, each capable of performing 32,000 encryptions per second, can complete a "brute force" attack against a 40-bit key in a single day. By comparison, a 56-bit key provides 65,536 times as many possible key values as a 40-bit key. [FN7]
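The arithmetic above is easy to verify directly (the 400-machine, 32,000-encryptions-per-second figures are Schneier's estimates as quoted in the text):

```python
# Verifying the keyspace arithmetic quoted above.
keys_per_second = 400 * 32_000          # Schneier's hypothetical network

def keyspace(bits: int) -> int:
    # Number of possible keys of the given length.
    return 2 ** bits

print(keyspace(8))                      # 256 possible 8-bit keys
print(keyspace(56) // keyspace(40))     # 65,536x more 56-bit keys than 40-bit

# Time to exhaust the full 40-bit keyspace at that rate, in days:
days = keyspace(40) / keys_per_second / 86_400
print(round(days, 2))                   # roughly one day
```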

Because the length of the cryptographic keys used by a product was, for many years, the principal factor in determining the exportability of the product, key length has been a very controversial matter. In January 1996, at the request of the Business Software Alliance, an ad hoc panel of seven cryptographers and computer scientists published their findings on the question of the minimum key length required to provide adequate security against exhaustive key searches in commercial encryption applications. [FN8] They concluded that:

Cryptosystems with 40-bit keys are inadequate to protect against brute-force attacks, and even DES (56 bits) is increasingly inadequate;

Adequate protection against well-funded attacks using more sophisticated equipment or techniques requires keys at least 75 bits long; and

Given the expected advances in computing power, adequate protection for the next 20 years would require keys at least 90 bits long.

There have also been many attempts to demonstrate the limits of particular key lengths by actually conducting brute force attacks. In 1995, a French graduate student reported breaking 40-bit RC4 in eight days by a brute-force, known-plaintext attack, using a network of about 120 workstations and two supercomputers. [FN9] Since then, a Berkeley graduate student cracked 40-bit RC4 in 4 hours, [FN10] a European team based in Switzerland cracked a 48-bit encryption code in less than two weeks, [FN11] and a programmer won a $10,000 prize for cracking 56-bit DES in approximately four months. [FN12] More recently, in July 1998, the Electronic Frontier Foundation (EFF) announced that it had built for less than $250,000 the first unclassified hardware for cracking messages encoded with DES, thereby winning RSA Laboratories' "DES Challenge II" contest and a $10,000 cash prize. This machine took less than 3 days to complete the challenge. [FN13] These and similar efforts have seriously undermined confidence in 40- and 56-bit key lengths and have raised the bar for what the worldwide market demands. As a result, 128 bits has become the minimum key length demanded by consumers.

§ 2(d) Cryptographic Algorithms (DES, Diffie-Hellman, RSA, RC2 and RC4)

The two most popular cryptographic algorithms in commercial use today are DES and RSA. IBM developed DES (Data Encryption Standard) in the mid-1970s (with assistance from NSA) for use by the federal government to protect "unclassified but sensitive" information. Banks and other financial institutions have since standardized on DES for data transmissions related to electronic funds transfers and other financial information. DES is a secret-key algorithm with a fixed key length of 56 bits. However, there is also a much stronger implementation of DES, known as "triple DES" or 3DES, in which the output is encrypted twice more, thereby encrypting data with a total of three distinct 56-bit keys.

The details of the DES algorithm are freely available from the National Institute of Standards and Technology (NIST), [FN14] and DES hardware and software have been readily available abroad for many years. [FN15] Despite the wide availability of DES overseas, prior to December 1998, the U.S. Government tightly restricted exports of mass market software products using DES for data encryption.

RC2 and RC4 are secret-key encryption algorithms designed by Ron Rivest for data encryption. They are "variable-key-size" ciphers, which means that developers can implement them with short or long keys. When used with longer keys, they are considered alternatives to DES or 3DES. [FN16]

Diffie-Hellman is a commonly used public-key algorithm for key exchange. It was patented in the United States (but the patent expired April 29, 1997). RSA, named after the three MIT scientists who invented it (Rivest, Shamir, and Adleman), is the most commonly used public-key algorithm. Unlike Diffie-Hellman, RSA can be used both for encryption and signing. It was patented in the U.S. (the patent expired September 20, 2000) but has been freely available abroad for years. Microsoft, Apple, Sun, Novell, Lotus, Netscape, IBM, Hewlett Packard, Motorola, AT&T and other leading computer and communications companies utilize Diffie-Hellman or RSA (or both) in commercially available software products. Although the Administration allows exports of Diffie-Hellman and RSA for authentication and integrity (digital signatures) regardless of key size, both are tightly controlled when used for data encryption/confidentiality (RSA) or key exchange (both).

§ 2(e) PGP

One of the best-known cryptographic software programs is Pretty Good Privacy (PGP). Phil Zimmermann, a software engineer from Boulder, Colorado, invented PGP in 1991 for the express purpose of providing strong cryptography at no cost to the general public. [FN17] PGP uses a 128-bit Swiss cipher (IDEA) for data encryption and Diffie-Hellman or RSA for key management. Schneier characterizes PGP as "well designed, nicely coded, and in the public domain. It's the closest you're likely to get to military grade encryption." [FN18]

Zimmermann is legendary in crypto circles for another reason: the Justice Department conducted a three-year investigation of him for possible export violations after PGP was posted to a Usenet news group and distributed internationally. Zimmermann always maintained that he neither shipped PGP overseas nor posted it to Usenet. In January 1996, the Justice Department announced that it would not seek an indictment due to insufficient evidence that he had conspired with an unnamed individual who admitted posting the program. If a similar case arose today, the posting of PGP would be perfectly legal due to recent liberalizations of the U.S. export rules that were implemented at least partially in response to the decision in the Bernstein case. [FN19]


§ 3(a) Background: Transfer of Jurisdiction from State to Commerce

Prior to December 30, 1996, the Office of Defense Trade Controls (DTC) of the State Department controlled virtually all encryption software and related technology as "defense articles" under Category XIII(b) of the U.S. Munitions List (USML), which is part of the International Traffic in Arms Regulations (ITAR). [FN20] The ITAR licensing regime imposed stringent controls on commercial encryption products, with a few limited exceptions. [FN21] A November 15, 1996 Executive Order, [FN22] implemented by December 30, 1996 regulations published by the Bureau of Export Administration (BXA), [FN23] transferred jurisdiction over "commercial encryption products" from the State Department to the Commerce Department -- thereby moving encryption items from the USML under the ITAR to the Commerce Control List (CCL) under the Export Administration Regulations (EAR). Encryption products specifically designed or modified for military use remain subject to ITAR controls.

The December 30, 1996 regulations defined a new class of national security and foreign policy controls known as "EI" controls and state that such controls apply only to "encryption items" transferred from the USML to the CCL consistent with the November 15 Executive Order. [FN24] All such encryption items--including encryption commodities controlled under ECCN 5A002, encryption software controlled under ECCN 5D002, and encryption technology controlled under ECCN 5E002--required a license for all destinations, except Canada. [FN25] EI controls did not apply to items on the USML and U.S. persons holding valid USML licenses issued by DTC prior to December 30, 1996 could continue shipping such items subject to the terms and conditions of the USML license. [FN26] Nor did EI controls apply to encryption items that were already subject to Commerce jurisdiction prior to December 30, 1996, such as items that were the subject of a prior commodity jurisdiction determination; or items covered by any of the exclusions set forth in ECCN 5A002; or any mass market software products released from EI controls as the result of a one-time review.

The December 30, 1996 regulations not only controlled commercial encryption products for EI reasons, they also took several additional steps to maintain largely the same licensing policies that DTC had applied to encryption software and technology under the ITAR. These included amending the EAR to ensure that provisions relating to public availability, [FN27] the de minimis rules, [FN28] foreign availability, [FN29] and the General Software Note, [FN30] are inapplicable to encryption items. In addition, the regulations prohibited the export of technical assistance (including training) to foreign persons with respect to encryption products [FN31] and defined the "export" of encryption source code or object code to include making it available on-line for transfer outside the U.S., unless precautions are taken that prevent the unauthorized transfer of the code. [FN32] Indeed, it is hardly surprising, in view of these changes to the EAR, that some commentators dubbed the EI controls the "virtual ITAR." Many of the remnants of this "virtual ITAR" are a major contributing factor to the complexity of the regulations today.

Prior to the November 15th Executive Order, the Departments of State, Defense, and Energy, and the Arms Control and Disarmament Agency, each had the right to review any license application to BXA. [FN33] The November 15th Executive Order and the implementing regulations granted the Department of Justice (DOJ) authority to review encryption exports and added DOJ to the Export Administration Review Board as well as the Operating Committee of the Advisory Committee on Export Policy with respect to reviewing encryption products. [FN34] Despite this unprecedented role for domestic law enforcement in export control decisions, the NSA remains the lead technical agency for evaluating whether encryption items are subject to EI controls. Over the past few years, the agency has emerged from its shadowy past [FN35] and has been thrust into the public spotlight as a result of renewed legislative interest in encryption export policy [FN36] and two high profile lawsuits challenging the constitutionality of ITAR and EAR controls on encryption exports. [FN37]

Subsequent regulations have been issued that further amended the EAR with respect to encryption exports, each step liberalizing the approach adopted in the December 30, 1996 regulations. Today, almost all encryption items are exportable to almost any end-user worldwide. Nevertheless, the licensing analysis to determine under what provision of the regulations a particular item is exportable, and what hoops the exporter must jump through in order to accomplish the export, is still quite complex.

§ 3(b) The Four-Step Licensing Analysis

Larry Christensen, the former Director of Regulatory Policy at BXA, has long advised exporters to utilize a decision tree for determining the appropriate License Exception or license for the export of software products and related technology. [FN38] In determining the licensing requirements for encryption software and technology under the current Commerce regime, U.S. exporters should use a similar approach and follow the steps outlined below, in descending order:

(1) Determine whether the software or technology qualifies as publicly available and, therefore, is outside the scope of the EAR;

(2) If not, determine whether the software or technology is outside the scope of EI controls for any of the following reasons:

The items were automatically transferred from ITAR to EAR jurisdiction under prior rules and are excluded from EI controls due to their limited encryption functionality;
The items were already under Commerce controls due to a prior case-by-case jurisdictional transfer; or
The items were released from EI controls because they qualify for the 56/512-bit encryption or 64-bit mass market exceptions.

(3) If not, determine whether the software or technology qualifies for License Exception TSU, ENC or KMI:

License Exception TSU, which covers "unrestricted" encryption source code and publicly available object code compiled from such source code.
License Exception ENC, which covers the export of:
Unlimited strength encryption products to any end-users in the EU+8 countries or to any non-government end-users elsewhere; or
"Retail" encryption items (of any strength) to all end-users (except, of course, to those in embargoed / terrorist countries).
License Exception KMI (Key Management Infrastructure), which covers the export of qualifying key escrow or key recovery products.

(4) If not, determine whether an Encryption Licensing Arrangement is available for a specific class of end-users (e.g., sales of "non-retail" encryption items to a group of government entities), or whether an individual license is available on a case-by-case basis.
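Schematically, the four steps form a simple decision tree. The sketch below is illustrative only -- the predicate names are invented for this example, and each test stands for the detailed legal analysis discussed in the sections that follow (for instance, as explained below, the "publicly available" test at Step 1 is largely pro forma for EI-controlled software):

```python
# Illustrative-only sketch of the four-step licensing decision tree.
# The boolean predicates are hypothetical placeholders for legal analysis.
from enum import Enum, auto

class Outcome(Enum):
    OUTSIDE_EAR = auto()          # Step 1: publicly available
    OUTSIDE_EI_CONTROLS = auto()  # Step 2: excluded or released from EI controls
    LICENSE_EXCEPTION = auto()    # Step 3: TSU, ENC, or KMI applies
    LICENSE_REQUIRED = auto()     # Step 4: ELA or individual license

def classify(item: dict) -> Outcome:
    if item.get("publicly_available"):   # Step 1
        return Outcome.OUTSIDE_EAR
    if item.get("outside_ei_controls"):  # Step 2
        return Outcome.OUTSIDE_EI_CONTROLS
    if item.get("license_exception"):    # Step 3 (TSU / ENC / KMI)
        return Outcome.LICENSE_EXCEPTION
    return Outcome.LICENSE_REQUIRED      # Step 4

print(classify({"license_exception": True}).name)
```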

These steps are discussed at length below.

§ 3(b)(1) Step 1: Determining Whether an Exemption for "Publicly Available" Software and Technology Applies

As a general rule, "publicly available" software and technology is outside the scope of the EAR. [FN39] This includes software and technology that is free from EAR licensing requirements because it:

Is published or will be published (for software, this means that it is available for general distribution either for free or at a price that does not exceed the cost of reproduction and distribution) (§ 734.7);
Arises during, or results from, certain "fundamental research" (§ 734.8);
Is released through academic courses or labs as "educational information" (§ 734.9); or
Is included in certain patent applications (§ 734.10).

However, as noted previously, the December 30, 1996 regulations amended Part 734 of the EAR to reflect that encryption software controlled for EI reasons under ECCN 5D002 remains subject to the EAR, even when publicly available (hereinafter referred to as the "Public Domain EI-Software Carve-Out"). [FN40] This includes encryption source code in electronic form or media (e.g., a computer diskette or CD-ROM), but not such code in printed form. [FN41]

Although the Public Domain EI-Software Carve-Out preserved the State Department's position that encryption software is ineligible for public domain treatment, it represented a major change in BXA policy, which previously recognized that First Amendment protection applies to software placed into the public domain (on the Internet or otherwise). [FN42] In defending this distinction between encryption software and other software regulated under the EAR, the Administration has emphasized that the former is controlled because of its functional capacity, and not because of any informational value. [FN43] This distinction has been challenged in the courts on constitutional grounds. [FN44] Unless and until these cases are decided against the government, Step 1 is merely pro forma for EI-controlled encryption software because the public domain exemption simply does not apply to such software.

Additionally, the general exclusion from "publicly available" treatment applies only to EI-controlled software. Thus, encryption technology under ECCN 5E002 can be eligible for publicly available treatment. ECCN 5E002 covers "Technology" according to the General Technology Note [FN45] for the "Development," "Production," or "Use" of equipment controlled by 5A002, 5B002 or software controlled by 5D002. The EAR defines "Technology" as "specific information necessary for the 'development', 'production', or 'use' of a product. The information takes the form of 'technical data' or 'technical assistance'." The definitions further state that "technical data" may take various forms such as "blueprints, plans, diagrams, models, formulae, tables, engineering designs and specifications, manuals and instructions written or recorded on other media" and that "technical assistance" may take forms such as "instruction, skills training, working knowledge, [or] consulting services." [FN46] Thus, ECCN 5E002 would seem to apply to all EI-controlled encryption technology.

Does this mean that the EAR permits the export of publicly available encryption technology? The correct answer is "yes," but as a practical matter this is true only to the extent that technology means "technical data" (such as printed material). EAR § 744.9(a) adds another general prohibition with respect to "technical assistance" in the foreign development or manufacture of encryption commodities and software that would be controlled for EI reasons under ECCN 5A002 or 5D002 if developed or manufactured in the United States. This provision represents BXA's effort to preserve the State Department restrictions on U.S. persons providing "defense services" [FN47] to foreign nationals--and this restriction seems to apply irrespective of whether the "technical assistance" is publicly available. In short, no license is needed to furnish books or other printed materials that are exempt from export controls for constitutional reasons (technical data), but assisting foreign persons with encryption development or manufacture may still be prohibited without a license (technical assistance).

§ 3(b)(2) Step 2: Determining Whether Encryption Software Is Not Subject to EI Controls

§ 3(b)(2)(i) Treatment of Software Formerly Covered by Category XIII(b)(1) Exemptions -- Exceptions to ECCNs 5A002 and 5D002

Prior to the transfer from the State Department to the Commerce Department of jurisdiction over encryption exports, ITAR controls on encryption did not apply to cryptographic equipment and software if their functionality was limited to any of the following nine categories:

(1) Decryption only for copy-protected software;

(2) Bank or money transactions;

(3) Cryptographic processing using analogue techniques in certain radio and fax equipment;

(4) Certain personalized smart cards;

(5) Access control devices such as ATMs;

(6) Data authentication;

(7) Fixed data compression or coding techniques;

(8) Set top decoders; and

(9) Anti-virus software. [FN48]

Rather, equipment and software performing these and no other functions were automatically transferred to EAR jurisdiction and classified under the Information Security entries in Category 5 of the CCL. All such cryptographic equipment and software qualified for License Exception treatment, making these items exportable to all but embargoed and terrorist supporting destinations.

The December 30, 1996 regulations preserved the Category XIII(b)(1) exceptions via notes to ECCN 5A002 and 5D002, which exclude from EI controls the encryption equipment and software performing the functions mentioned above. Rather, the items in question (assuming they have no other functions covered by a more restrictive ECCN) are controlled instead under ECCN 5A995 and 5D995, respectively, making them eligible for export with No License Required (using the designator "NLR") to all countries except for those having an "X" in the box under AT Column 2 to the Country Chart in Supp. 1 to Part 738. [FN49]

These exceptions are still reflected in the current regulations. Today, ECCN 5A002 and 5D002 exclude the following items:

(a) "Personalized smart cards" where the cryptographic capability is restricted for use in equipment or systems excluded from control paragraphs (b) through (f) of this note. Note that if a "personalized smart card" has multiple functions, the control status of each function is assessed individually;

(b) Receiving equipment for radio broadcast, pay television or similar restricted audience broadcast of the consumer type, without digital encryption except that exclusively used for sending the billing or program-related information back to the broadcast providers;

(c) Portable or mobile radiotelephones for civil use (e.g., for use with commercial civil cellular radio communications systems) that are not capable of end-to-end encryption;

(d) Equipment where the cryptographic capability is not user-accessible and which is specially designed and limited to allow any of the following:

execution of copy-protected "software";

access to any of the following: (a) copy-protected read-only media; or (b) information stored in encrypted form on media (e.g., in connection with the protection of intellectual property rights) where the media is offered for sale in identical sets to the public; or

one-time encryption of copyright protected audio/video data;

(e) Cryptographic equipment specially designed and limited for banking use or money transactions;

(f) Cordless telephone equipment not capable of end-to-end encryption where the maximum effective range of unboosted cordless operation (e.g., a single, unrelayed hop between terminal and home basestation) is less than 400 meters, according to the manufacturer's specifications.

Plus, ECCN 5A002 and 5D002 exclude cryptography used for authentication or digital signature functions. According to the current regulations, "authentication" includes all aspects of access control where there is no encryption of files or text except as directly related to the protection of passwords, Personal Identification Numbers (PINs) or similar data to prevent unauthorized access. Authentication and digital signature functions include their associated key management function.

Finally, the cryptography controlled under ECCN 5A002 and 5D002 does not include "fixed" data compression or coding techniques.

§ 3(b)(2)(ii) Exports of 56-bit and 64-bit Encryption Under 5A992/5D992 (NLR)

The January 14, 2000 regulations made certain encryption items with limited key lengths eligible for export to all destinations (except the embargoed / terrorist countries) under ECCNs 5A992 and 5D992 (NLR -- "no license required"). [FN50] Specifically, this provision includes any encryption item with a key length up to 56 bits and a key exchange mechanism not exceeding 512 bits. [FN51] It also includes any product limited to key management functionality that uses a public-key algorithm with a key length not exceeding 512 bits. Finally, it includes any "mass market" encryption item with a key length up to 64 bits.

"Mass market" is defined as hardware and software that is available to the public via sales from stock at retail selling points, by means of over-the-counter transactions, mail order transactions, electronic transactions, or telephone call transactions. The items must be designed for installation by the user without further substantial support by the supplier. [FN52] Moreover, the cryptographic functionality cannot be easily modified by the user. [FN53]

In order for 56/512-bit or 64-bit encryption products to be eligible for NLR treatment under ECCNs 5A992 or 5D992, they must first undergo a one-time technical review in accordance with Supplement 6 to Part 742. Mass market encryption items with 40- or 56-bit key lengths that were reviewed prior to January 14, 2000 and classified as eligible for export under a license exception (TSU for software and ENC for hardware) can have their key lengths increased to 64 bits and be exported as mass market products under 5A992/5D992 without an additional review. [FN54] Moreover, items exported under this provision are not subject to reporting requirements.

This different treatment for shorter key lengths is a carryover from the criteria the DTC used for commodity jurisdiction ("CJ") transfers under the SPA agreement. [FN55] The December 30, 1996 regulations allowed 40-bit mass market products to be released from EI controls after a one-time review and classification. The key length was subsequently raised to 56 bits, and with the January 14, 2000 regulations the eligible key lengths were raised to the 56- and 64-bit limits described above.
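The key-length thresholds described in this subsection can be sketched as a simple eligibility check. This is an illustrative simplification only; the function and parameter names are hypothetical, and it ignores the required one-time technical review and the embargoed / terrorist country exclusions, both of which apply regardless.

```python
def eligible_5x992_nlr(symmetric_bits, key_exchange_bits=0,
                       mass_market=False, key_mgmt_only=False):
    # "Mass market" items qualify with key lengths up to 64 bits.
    if mass_market and symmetric_bits <= 64:
        return True
    # Products limited to key management functionality qualify with a
    # public-key algorithm key length not exceeding 512 bits.
    if key_mgmt_only and key_exchange_bits <= 512:
        return True
    # Otherwise: up to 56-bit keys with key exchange up to 512 bits.
    return symmetric_bits <= 56 and key_exchange_bits <= 512
```

Under this sketch, a 56/512-bit product qualifies for NLR treatment, while a 56/1024-bit non-mass-market product does not (although, as § 3(b)(3)(ii) explains, it may qualify as "retail").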

§ 3(b)(3) Step 3: Determine Whether the Product Qualifies for License Exception TSU, License Exception ENC or License Exception KMI

§ 3(b)(3)(i) License Exception TSU -- "Unrestricted" Source Code and Object Code

However, recent amendments to the EAR have had the effect of chipping away at the Public Domain EI-Software Carve-Out, [FN56] at least for a limited category of EI-controlled software. The January 14, 2000 amendments made "unrestricted encryption source code" eligible for release from EI controls and exportable under License Exception TSU. [FN57] And under the October 19, 2000 amendments, object code software compiled from "unrestricted encryption source code" has been made eligible for License Exception TSU. [FN58] In either case, the source code or object code must otherwise meet the EAR criteria of "publicly available." Additionally, the software must be made available for free: for source code this means that the code must not be "subject to an express agreement for the payment of a licensing fee or royalty for commercial production or sale of any product developed with the source code," [FN59] and for object code, there must be "no fee or payment (other than reasonable and customary fees for reproduction and distribution) ... required for the object code."

At least with respect to the unrestricted source code, there is a substantial conceptual inconsistency within the EAR. On the one hand, EAR § 740.13(e)(1) states that such source code "is released from EI controls" once the exporter has provided the URL or a copy of the source code to BXA. As we learned in § 3(b)(1) above, once software is released from EI controls, it is eligible for "publicly available" treatment -- and we already know that this "unrestricted" source code meets the "publicly available" criteria, since that was one of the requirements for release from EI controls. As we also learned, eligible software that meets the "publicly available" criteria is no longer subject to the EAR. Thus, by stating that "unrestricted" source code is released from EI controls, the regulations are in effect stating that this category of source code is no longer subject to the EAR. On the other hand, EAR § 740.13(e) contains other provisions that purport to impose ongoing restrictions on this source code, even though, in effect, it is no longer subject to these provisions. For example, EAR § 740.13(e)(3) states that an exporter may not knowingly export this source code to Cuba, Iran, Iraq, Libya, North Korea, Sudan or Syria. However, software that is not subject to the EAR can, for example, be exported to Syria since there are no other general prohibitions on exports to Syria outside of the EAR. [FN60] As a practical matter, however, there is little advantage in most cases to treating code as "not subject to the EAR" over the alternative of exporting it under License Exception TSU, so many exporters will likely decide not to "push the envelope."

§ 3(b)(3)(ii) License Exception ENC -- Retail Encryption Items

The January 14, 2000 regulations created the new category of "retail" encryption items. There are several ways in which a product can qualify for "retail" status. Retail encryption items consist of any product that meets at least one of the criteria of § 740.17(b)(3)(i) and all the criteria of § 740.17(b)(3)(ii). [FN61]

The regulations also provide a non-exhaustive list of items that meet the "retail" criteria, subject to the requirements of § § 740.17(b)(3)(i) and (ii). These include "general purpose operating systems and their associated user-interface client software or general purpose operating systems with embedded networking and server capabilities; non-programmable encryption chips and chips that are constrained by design for retail products; low-end routers, firewalls and networking or cable equipment designed for small office or home use; programmable database management systems and associated application servers; low-end servers and application-specific servers (including client-server applications, e.g., Secure Socket Layer (SSL)-based applications) that interface directly with the user; and encryption products distributed without charge or through free or anonymous downloads." [FN62]

Additionally, "retail" encryption also includes finance-specific encryption items. [FN63] This applies to products that are limited by design for use in financial transactions or communications. Such limitations may include encryption capabilities that are highly field-formatted with validation procedures (e.g., specifically delineated fields such as merchant's / customer's ID, goods, payment mechanism, etc.) and are not easily diverted to other uses.

Also included in the "retail" category are non-mass-market products with a key length up to 56 bits, and with key exchange greater than 512 and up to 1024 bits. [FN64] Note that, as discussed in § 3(b)(2)(ii) above, 56-bit non-mass market products with a key exchange mechanism limited to 512 bits are exportable under ECCNs 5A992 or 5D992 (NLR), as are any mass-market products with a key length up to 64 bits.

Another subcategory of "retail" encryption includes products in which the only encryption functionality is for short-range wireless communications. [FN65] Examples of such products include audio devices, cameras and videos, computer accessories, handheld devices, mobile phones and consumer appliances (e.g., refrigerators, microwaves, washing machines, etc.) that use the Bluetooth or HomeRF specifications.

Finally, any other encryption product that provides equivalent functionality to other products that have been classified as "retail" will also be treated as "retail." [FN66]

For the different subcategories of "retail" encryption items, there are different requirements regarding prior review and classification. Full review and classification, according to the guidelines of Supplement 6 to Part 742, is required in order for a product to qualify for retail status under the criteria of § § 740.17(b)(3)(i) and (ii), including the list of examples of "retail" product types that is included in § 740.17(b)(3)(iii). Presumably, a full review and classification would also be required for the provision that grants retail status to products that provide equivalent functionality to other products that have previously been classified as retail. 56/1024-bit products and finance-specific products can be exported under License Exception ENC as "retail" products immediately after submitting a classification request to BXA (thereby making this more of a notification requirement, rather than forcing the exporter to wait for a formal classification from BXA). Finally, the "retail" subcategory for short-range wireless encryption permits the export of such products without any prior review or classification.

Practice Tip (de minimis eligibility): When applying for a "retail" classification for 5D002 software, include in the application a request that the software also be made eligible for "de minimis" treatment under EAR § 734.4. The EAR's de minimis rule states that reexports of foreign-made commodities, software or technology, that incorporate a de minimis level of controlled U.S.-origin commodities, software or technology, are not subject to the EAR. When the reexport is to a terrorist or embargoed destination, the de minimis level is 10% or less of the total value of the foreign-made commodity. For all other destinations, the de minimis level is 25%. The December 30, 1996 regulations excluded EI-controlled encryption items from de minimis eligibility. The October 19, 2000 regulations loosened this exclusion, however, by allowing exporters to request de minimis eligibility for 5D002 software exportable under the "retail" or "source code" provisions of License Exception ENC. [FN67] By including this request in the "retail" classification request (that will likely be submitted in any case), exporters can eliminate an additional step in seeking the broadest possible approval for a given product.
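The de minimis arithmetic described in this tip can be illustrated with a short sketch. The function name is hypothetical, and the test is a simplification of EAR § 734.4; it assumes the incorporated U.S.-origin software has itself been made de minimis-eligible (e.g., "retail" 5D002 software under the October 19, 2000 rule).

```python
def not_subject_to_ear(us_content_value, total_value, embargoed_dest=False):
    # 10% threshold for terrorist / embargoed destinations; 25% otherwise.
    threshold = 0.10 if embargoed_dest else 0.25
    return (us_content_value / total_value) <= threshold
```

For example, a foreign-made product worth $100 that incorporates $20 of eligible U.S.-origin software would not be subject to the EAR when reexported to most destinations (20% is under the 25% level), but would remain subject to the EAR when reexported to an embargoed destination (20% exceeds the 10% level).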

§ 3(b)(3)(iii) License Exception ENC -- Non-Retail Encryption Products

5A002 or 5D002 encryption items that are not classified as retail are still widely exportable under License Exception ENC.

For example, any encryption products (except cryptanalytic items) [FN68] can be exported under License Exception ENC to any end-user located in the European Union, plus eight additional countries, as listed in Supplement 3 to Part 740. [FN69] Moreover, such items can be exported to any office or subsidiary (except those located in an embargoed / terrorist destination), of a company, organization or government headquartered in Canada or one of the EU+8 countries. [FN70] Such exports are permitted immediately once the exporter has submitted a classification request (i.e. there is no need to wait for a response from BXA). [FN71]

Any encryption products (except cryptanalytic items) can also be exported to subsidiaries of U.S. companies [FN72] worldwide. Moreover, exports to U.S. subsidiaries can be carried out without prior review and classification by BXA. [FN73] One of the major benefits of this provision of License Exception ENC is that it frees U.S. companies from putting up internal barriers and allows them to freely transfer encryption technology and source code to their foreign facilities for uses such as product development. [FN74]

Finally, EI-controlled encryption items that have been submitted for review and classified as non-retail can still be exported to non-government end-users anywhere except the embargoed / terrorist destinations. [FN75] If a retail classification request has been submitted, and 30 days have passed since the receipt of the request without a response from BXA, the exporter can also export under ENC to non-government end-users. [FN76] As a practical matter, of course, this provision would only be used for non-government end-users outside of the EU+8 countries, since EI-controlled items can be exported to any user (government and non-government) inside the EU+8 immediately upon submission of a classification request.

While License Exception ENC permits the export of non-retail encryption items to any non-government end-user worldwide, there is one limitation on the use of such products by a certain class of end-users. Specifically, a BXA license is required for the use of any non-retail encryption item by any ISPs and telecommunication service providers to provide services specific to government end-users outside the EU+8. [FN77] Generally available services, where the consumers of that service may be both government and non-government users, would not require a license, since such services would not be "specific to government end-users."

§ 3(b)(3)(iv) License Exception KMI -- Key Escrow and Key Recovery Products

License Exception KMI encompasses what is now a rather obscure licensing subcategory. Specifically, it includes key escrow and key recovery products of unlimited key lengths that satisfy detailed technical criteria. Eligibility for License Exception KMI requires a one-time BXA review. Exporters must submit a classification request in accordance with the procedures set forth in EAR § 740.8(b)(2) demonstrating that the key escrow and key recovery products meet the criteria identified in Supplement No. 4 to Part 742 of the EAR. License Exception KMI also requires semi-annual reporting of the quantity shipped, the ultimate consignee, the country of ultimate destination, and, if available, the specific end-user name and address.

This License Exception is a remnant of earlier Administration policies that attempted to use export control relief to encourage the development and sale of key escrow and key recovery encryption products. Prior to the December 1998 regulations, License Exception KMI also required prior approval of key recovery agents and included very detailed requirements for such approval. Unfortunately, the Administration did not modify the technical criteria in Supplement 4. These criteria continue to reflect (with only minor revisions) the November and December 1995 NIST criteria for Software Key Escrow, which industry and privacy advocates rejected at that time and continue to criticize. [FN78]

While License Exception KMI for key-escrow or key-recovery products has never been widely used, its use today would be extremely rare. Given the wide exportability of non-recovery encryption items under the various provisions of License Exception ENC, there is little incentive to develop and market products that meet the strict requirements of Supplement 4 to Part 742. Assuming, however, that an exporter does have such a product, it would be much easier to export under License Exception ENC. The key-escrow or key-recovery product may be eligible for classification as a "retail" encryption item. If so, it could be exported to any user worldwide, except those in the embargoed / terrorist destinations (the same scope as License Exception KMI), but the export(s) would be subject to the much more limited reporting requirements of License Exception ENC. Even if the product is not eligible for "retail" treatment, it could still be exported under ENC to any end-user in the EU+8 countries listed in Supplement 3 to Part 740, and to any non-government end-user elsewhere (except the embargoed / terrorist destinations). Thus, the only case in which License Exception KMI would arguably be useful is for the export of a non-retail key-escrow or key-recovery encryption item to a foreign government outside of the EU+8.

§ 3(b)(4) Step 4: Determine Whether the Product Qualifies for an Encryption Licensing Arrangement (ELA) or Individual License

For exports of EI-controlled encryption items that are not eligible for a license exception, an exporter can apply for an individual license or an Encryption Licensing Arrangement (ELA). The December 30, 1996 regulations introduced the ELA as a substitute mechanism for exporting strong encryption to broad classes of end-users, and those provisions can now be found in EAR § 742.15(b)(3). ELAs are available for all destinations except Cuba, Iran, Iraq, Libya, North Korea, Syria and Sudan. [FN79] ELAs are subject to case-by-case review and negotiation with the NSA and BXA (and perhaps Justice/FBI). Although ELAs can be more difficult to prepare than classification requests or applications for individual licenses, and may impose burdensome reporting requirements as conditions on the license (including information such as end-user name and address), they can allow more flexible treatment of strong crypto exports to groups of end-users not covered by license exceptions.

Examples of cases that might require an individual license or an ELA include: approving exports of non-retail encryption items to government end-users outside the EU+8 countries, [FN80] authorizing the use of a non-retail encryption item by an ISP to provide services specific to a government end-user outside the EU+8 countries, [FN81] approving certain exports or reexports to an embargoed / terrorist destination (individual license only, ELAs are not available for those destinations), [FN82] or authorizing under the deemed export rule certain activities by a foreign national of an embargoed / terrorist destination working for a U.S. company. [FN83]

§ 3(c) Other Licensing Issues

§ 3(c)(1) Source Code Exports

Recent changes to the EAR have created specific rules regarding the export of cryptographic source code. The regulations divide source code into three distinct categories: "unrestricted" encryption source code, publicly available commercial encryption source code, and non-publicly available commercial source code. Each category is subject to a very different set of rules.

Unrestricted encryption source code (e.g. open source) is defined as source code that would be considered publicly available under EAR § 734.3(b)(3), and is not subject to an express agreement for the payment of a licensing fee or royalty for commercial production or sale of any product developed with the source code. Moreover, the regulations clarify that intellectual property protection will not, by itself, be construed as an express agreement for payments of a licensing fee or royalty. [FN84] As discussed above in § 3(b)(3)(i), unrestricted encryption source code is exportable to any end-user worldwide under License Exception TSU immediately upon submission to BXA of written notification of the URL or a copy of the source code.

Publicly available commercial encryption source code is defined as source code that would be considered publicly available under EAR § 734.3(b)(3), and is subject to an express agreement for the payment of a licensing fee or royalty for commercial production or sale of any product developed with the source code. [FN85] As with unrestricted encryption source code, publicly available commercial encryption source code is exportable to any end-user worldwide immediately upon submission to BXA of written notification of the URL or a copy of the source code. However, this category of code is exportable under License Exception ENC instead of TSU. As a result, direct exports of commercial source code to foreign manufacturers, when intended for use in foreign products developed for commercial sale, are subject to reporting requirements. [FN86]

Finally, non-publicly available commercial encryption source code is defined as source code that would not be considered publicly available under EAR § 734.3(b)(3) and does not, when compiled, provide an "open cryptographic interface." [FN87] The scope of the eligible recipients is narrower than the other two categories of source code -- this code is exportable under License Exception ENC to any end-user in the EU+8 countries, [FN88] and any non-government end-user in the rest of the world. [FN89] This category of source code is also exportable immediately, but only after the submission of a completed classification request (as opposed to the written notification of URL or copy of the source code that is required for the other two categories of source code). [FN90] Non-publicly available commercial encryption source code is subject to the same reporting requirements as publicly available commercial encryption source code.
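The three-way classification described in this subsection can be summarized as a decision function. This is a simplified sketch with hypothetical names, condensing EAR §§ 740.13(e) and 740.17(b)(4); it does not capture the differing notification and classification procedures, which are discussed separately.

```python
def classify_source_code(publicly_available, fee_agreement,
                         open_interface_when_compiled=False):
    """Return (category, license exception, eligible end-users)."""
    if publicly_available and not fee_agreement:
        return ("unrestricted", "TSU", "any end-user worldwide")
    if publicly_available:
        return ("commercial, publicly available", "ENC",
                "any end-user worldwide")
    if open_interface_when_compiled:
        # Non-publicly available code that, when compiled, provides an
        # "open cryptographic interface" falls outside these categories.
        return (None, None, None)
    return ("commercial, non-publicly available", "ENC",
            "any end-user in EU+8; non-government end-users elsewhere")
```

Here `fee_agreement` stands for an express agreement for the payment of a licensing fee or royalty for commercial production or sale of any product developed with the source code.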

Foreign products that incorporate or are developed with U.S.-origin source code (of any of the three types) do not require any review or classification by BXA, and can be exported and reexported without any BXA authorization (although technically they remain subject to the EAR). [FN91]

The following chart summarizes the major differences in the treatment of the three categories of encryption source code:

Unrestricted Source Code
  Eligible End-User: Any
  License Exception: TSU
  Reporting: No
  Classification / Review Req.: Export upon BXA notification
  Compiled Object Code: Exportable under TSU if publicly available
  Internet Posting: [FN92] No screening / Not an export
  Principal EAR Provision: 740.13(e)

Commercial Source Code (publicly available)
  Eligible End-User: Any
  License Exception: ENC
  Reporting: Yes (for direct sale to manufacturers)
  Classification / Review Req.: Export upon BXA notification
  Compiled Object Code: Exportable under ENC if publicly available
  Internet Posting: No screening / Not an export
  Principal EAR Provision: 740.17(b)(4)(i)

Commercial Source Code (non-publicly available)
  Eligible End-User: Any in EU+8; non-government elsewhere
  License Exception: ENC
  Reporting: Yes (for direct sale to manufacturers)
  Classification / Review Req.: Export upon submission of classification request
  Compiled Object Code: No special treatment
  Internet Posting: Screening for gov't users outside the "EU+8" countries
  Principal EAR Provision: 740.17(b)(4)(ii) [FN93]

§ 3(c)(2) Posting EI-Controlled Software to the Internet is an Export Unless Certain Precautions are Taken

Prior to the publication of the December 30, 1996 regulations, neither the ITAR nor the EAR directly addressed the question of whether distributing software on-line constituted an export. DTC had developed informal guidelines that a few universities and vendors relied upon in establishing FTP or Web sites for "secure downloading" of software programs with strong crypto features. These sites generally utilized some combination of the following safeguards to ensure compliance with U.S. export law:

1. Blocking access to foreign users by reverse resolution of the user's IP address. If the address resolves to a foreign domain name, the server should disallow access to the product in question.

2. Requiring anyone who accesses the software to read an export notice (or export clause in an on-line license agreement) with information on who can or cannot legally download the software and what further precautions must be observed regarding re-export. Moreover, the site should require the user to agree to abide by these restrictions in order to proceed with download. Some vendors also screened all users against the government's "prohibited party" list. [FN94]

3. Limiting access to the crypto software by storing it in a hidden directory, which is made accessible only when a user agrees to the export agreement. Storing restricted software in such a directory is not strictly required provided the site utilizes some technique for restricting downloading privileges to identified users with U.S. IP addresses.

4. Maintaining an access log and routinely reviewing it for obvious hacks or penetrations, and reporting such incidents to the export authorities. [FN95]

The December 30, 1996 regulations changed all this by expanding the definition of "export" to include making EI controlled encryption source code and object code available for download to persons outside the U.S. or Canada (via bulletin board, FTP or Web sites), unless "adequate precautions" were taken to prevent unauthorized transfers. [FN96] The regulations allow operators of Web servers or other communication facilities to avoid liability by ensuring that the facility from which the software is available controls the access to, and transfers of, such software through the following measures:

A. The access control system, either through automated means or human intervention, checks the address of every system requesting or receiving a transfer and verifies that such systems are located within the United States;

B. The access control system provides every requesting or receiving party with notice that the transfer includes or would include cryptographic software subject to export controls under the Export Administration Act, and that anyone receiving such a transfer cannot export the software without a license; and

C. Every party requesting or receiving a transfer of such software must acknowledge affirmatively that he or she understands that the cryptographic software is subject to export controls under the Export Administration Act and that anyone receiving the transfer cannot export the software without a license. BXA will consider acknowledgments in electronic form provided that they are adequate to assure legal undertakings similar to written acknowledgments. [FN97]
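By way of illustration only, measures (A) through (C) might be approximated in server-side logic along these lines. All names are hypothetical, the foreign-domain list is a placeholder (a real system would need a complete and maintained one), and this follows the reverse-resolution heuristic described earlier rather than any mandated mechanism; it is a sketch, not a compliance tool.

```python
EXPORT_NOTICE = (
    "This transfer includes cryptographic software subject to export "
    "controls under the Export Administration Act; anyone receiving "
    "this transfer cannot export the software without a license."
)

# Illustrative placeholder list of foreign country-code domains.
FOREIGN_CC_TLDS = {".de", ".fr", ".jp", ".ru", ".cn"}

def permit_transfer(resolved_hostname, acknowledged):
    # (A) Verify the requesting system appears to be located in the
    # United States; deny unresolvable or foreign-domain addresses.
    if not resolved_hostname:
        return False
    host = resolved_hostname.lower()
    if any(host.endswith(tld) for tld in FOREIGN_CC_TLDS):
        return False
    # (B) EXPORT_NOTICE must have been displayed to the party, and
    # (C) the party must have affirmatively acknowledged it.
    return bool(acknowledged)
```

In practice the hostname would come from reverse-resolving the requester's IP address, with the acknowledgment captured in a form adequate to assure a legal undertaking, as the regulations require.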

The January 14, 2000 and October 19, 2000 amendments to the regulations narrowed the scope of this rule by exempting certain EI-controlled software from the need to employ such "adequate precautions" when posting this software to the web. Specifically, for both "unrestricted" source code and publicly available commercial source code, as well as object code compiled from either type of source code, the regulations specifically state that posting the code on the Internet where it may be downloaded by anyone worldwide "would not establish 'knowledge' of a prohibited export or reexport. In addition, such posting would not trigger 'red flags' necessitating the affirmative duty to inquire under the 'Know Your Customer' guidance provided in Supplement No. 3 to part 732 of the EAR."

Additionally, while the text of the regulations specifically exempts only the two types of source code and the corresponding object code, the regulations remain ambiguous as to whether screening is required for making "retail" software available for download. The background information contained in the Federal Register notice with the January 14, 2000 regulations stated "unrestricted encryption source code under Sec. 740.13(e), commercial encryption source code under Sec. 740.17(a)(5)(i) and retail products under Sec. 740.17(a)(3) are exempted from Internet download screening requirements in Sec. 734.2(b)(9)(iii)." [FN98] Given that retail encryption products are exportable to virtually any end-user worldwide (except those in embargoed countries), the "adequate precautions" listed in § 734.2(b)(9)(iii) (e.g., checking that the requesting address is located in the U.S.) clearly do not seem applicable. If they did apply, this would mean that retail encryption products could not be made available over the Internet to users outside the U.S. at all. It is an open question, however, whether any kind of screening is required for retail encryption products.

§ 3(c)(3) The Deemed Export Rule

The "deemed export" rule states that the release of technology or source code to foreign nationals within the U.S. is deemed an export to the foreign national's home country. [FN99] For purposes of the deemed export rule, a "foreign national" includes an individual who enters the U.S. for a temporary stay in any of the alphabet soup of nonimmigrant visa categories (e.g., B, E, F, H, J or L), but does not include individuals who are protected from citizenship discrimination under the Immigration and Nationality Act [FN100] (i.e., citizens and nationals of the U.S., lawful permanent residents (holders of "green cards"), refugees, asylees, and legalization beneficiaries). Thus, a deemed export may occur whenever a U.S. company employs a foreign national (including foreign graduates of U.S. schools) whose job duties require access to controlled technology or source code. All such technology transfers require a license from BXA. Moreover, foreign nationals who violate the deemed export rule may also face severe immigration consequences. [FN101]

However, the constitutional concerns raised in the Bernstein case [FN102] seem to have forced BXA and its advisory agencies to exercise self-restraint in applying the deemed export rule to encryption software and technology. As noted above, the "deemed export" provision is set out in EAR § 734.2(b)(2), but this section specifically excludes "encryption software subject to 'EI' controls"; it also contains a note stating "See paragraph (b)(9) for provisions that apply to encryption source code and object code software." Section 734.2(b)(9) in turn defines "export" of encryption source code and object code as including the actual shipment, transfer or transmission out of the U.S. or transfer of such software in the U.S. to an embassy or affiliate of a foreign country. Unlike EAR § 734.2(b)(2), however, this definition does not include "release of technology or source code subject to the EAR to a foreign national." Thus, encryption software is not subject to the deemed export rule by negative implication from EAR § 734.2(b)(9).

What about encryption technology? Until recently, the deemed export rule did apply to encryption technology because nothing in EAR § 734.2(b)(9) implies otherwise. However, the January 14, 2000 and October 19, 2000 regulations made changes essentially eliminating the deemed export rule for encryption technology. [FN103] Specifically, License Exception ENC now "authorizes transfers by U.S. companies of encryption technology controlled under 5E002 to foreign nationals in the United States, (except nationals of Cuba, Iran, Iraq, Libya, North Korea, Sudan or Syria) for internal company use, including the development of new products." [FN104]

But the deemed export rule carve-out for encryption technology applies only to technology controlled under ECCN 5E002. So the rule still applies to other encryption technology under ECCN 5E995. However, because such technology can be exported to any non-embargoed destination without a license, the deemed export rule would only be an issue with respect to nationals from an embargoed destination.

Thus, to summarize, notwithstanding the deemed export rule, a U.S. employer may disclose encryption source code, object code or technology to a foreign national in the U.S. However, for any disclosures of controlled software or technology to a foreign national of an embargoed destination, a "deemed export license" will likely be required. [FN105]
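The summary above can be restated as a rough decision sketch. The function and category names are hypothetical, and the logic is deliberately simplified (for example, it does not reflect that the 5E002 carve-out under License Exception ENC is limited to transfers by U.S. companies for internal company use).

```python
EMBARGOED = {"Cuba", "Iran", "Iraq", "Libya", "North Korea", "Sudan", "Syria"}

def deemed_export_license_needed(item, nationality):
    """item is one of: 'encryption_software', '5E002_technology',
    '5E995_technology'."""
    if nationality in EMBARGOED:
        # Disclosures to foreign nationals of embargoed destinations
        # will likely require a "deemed export" license.
        return True
    if item == "encryption_software":
        return False  # excluded from the rule by EAR § 734.2(b)(9)
    if item == "5E002_technology":
        return False  # License Exception ENC internal-use transfers
    if item == "5E995_technology":
        return False  # exportable NLR to non-embargoed destinations
    raise ValueError("unrecognized item category")
```

The upshot, as the text notes, is that outside the embargoed-national case, the deemed export rule rarely constrains disclosures of encryption software or technology.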


§ 4(a) The Karn and Bernstein Cases

In a 1994 interview, John Gilmore, a co-founder both of the Electronic Frontier Foundation (EFF) [FN106] and the Cypherpunks, [FN107] commented on the conflict between export control laws and the First Amendment. Gilmore observed that while the government has never sought to restrict the export of published books containing source code, electronic representations of cryptography have a far less certain status:

There's a whole continuum between a book about cryptography, a book listing source code, an on-line copy of that book, a piece of actual source code, a piece of binary code stored on diskette, a piece of binary code loaded into a general-purpose computer, and a machine that does nothing but encoding and decoding. Somewhere along that continuum, you go from having full rights to anything you want, to having no export rights. It's not clear where the line should be drawn. The government benefits from leaving this line fuzzy, since people who actually have the right to export are afraid that they don't, and don't do it. [FN108]

Two federal decisions address the constitutional issues underlying Gilmore's observation: namely, whether the export regulations as applied to encryption software and technology violate the First Amendment by requiring licensing approval for a diskette containing crypto source code while allowing the export of a book containing the same source code in text form.

In Karn v. Department of State, [FN109] the late Judge Charles R. Richey, writing for the D.C. District Court, not only deferred to the government's expertise in making CJ determinations but also relied heavily on the "political question" doctrine in rejecting Karn's "meritless" constitutional claims. In sharp contrast, Judge Marilyn H. Patel of the Northern District of California fully embraced the plaintiff's argument in Bernstein v. Department of State [FN110] that source code is protected speech for purposes of First Amendment analysis, while evincing little concern over the "political question" doctrine.

The Karn Case. In February 1994, Phil Karn, a computer engineer with several cryptographic inventions to his credit, submitted a CJ request for Bruce Schneier's book "Applied Cryptography--Protocols, Algorithms, and Source Code in C." [FN111] The book included over 100 pages of fully documented source code of DES, FEAL, IDEA and several other powerful encryption algorithms, written in the C programming language. On March 2, 1994, DTC determined that the book was not subject to ITAR jurisdiction since "the item is in the public domain." However, DTC further stated that the ruling covered only the book and not a source code disk available from the author containing these algorithms, additional algorithms, several complete cryptographic systems, several programs for random number generation, and other useful cryptographic materials (including a copy of the ITAR).

Karn then submitted a second CJ request for a source code disk that he had prepared containing only the source code appearing in the published book. DTC denied the CJ, ruling that the disk was a defense article under Category XIII(b)(1) of the ITAR. Specifically, DTC rejected Karn's argument that the text files on the disk were identical to the printed source code, on the grounds that "[e]ach source code listing has been partitioned into its own file and has the capability of being easily compiled into an executable subroutine." Noting that the disk contained source code listings for algorithms that would not be exportable if they were implemented in a software program, DTC concluded that the "thousands of lines of easily executable code contained on the subject disk...added value to any end-user."

Next, Karn filed an administrative appeal reaffirming his earlier argument that the source code in printed and electronic form were identical and adding two more arguments: First, that the disk itself is in the public domain because it is available by mail order for a nominal charge and second, that DTC's decision violates the First Amendment, which protects the freedom of speech and of the press "regardless of medium (diskette or printed textbook)." On October 7, 1994, State reaffirmed DTC's earlier decision that the source code disk was a defense article under Category XIII(b)(1).

Critics branded DTC's decision as "utterly moronic" [FN112] and as demonstrating "Alice in Wonderland stupidity." [FN113] A year later, Karn filed a federal lawsuit alleging that the CJ denial was an abuse of administrative discretion under the Administrative Procedure Act ("APA") and raising First and Fifth Amendment claims as well. The State Department then moved to dismiss Karn's challenge on the grounds that AECA precluded judicial review of CJ determinations and also moved for summary judgment of all of Karn's constitutional claims. In a decision dated March 22, 1996, Judge Richey dismissed the APA claim as nonjusticiable and granted defendant's motion for summary judgment.

Karn had sought to overcome the AECA's bar on judicial review by arguing that it covered only the designation of items as defense articles and not the interpretation and application of USML categories to specific items. Judge Richey not only rejected this position as "strained and unreasonable," but held that any ambiguities regarding this issue should be resolved in the government's favor given the objectives of the arms export laws and the lack of merit in Karn's constitutional and administrative claims. Indeed, Judge Richey refused to involve his court in reviewing the fundamental question at issue in Karn's constitutional challenge--whether the proliferation of cryptographic products harms national security--because he considered this a foreign policy decision most appropriately left to the President (under the so-called "political question" doctrine).

Karn appealed Judge Richey's ruling, but soon afterward BXA issued the December 30, 1996 regulations, resulting in a remand to the district court. The parties then agreed that Karn would file a commodity classification request under the new rules. On August 22, 1997, BXA determined that Karn's diskette was subject to export controls under ECCN 5D002 and thereafter formally denied Karn's license application, thereby setting the stage for a return to federal district court to determine the effect of the new rules on Karn's claims. However, following the publication of the January 2000 regulations, which authorize the export of publicly available source code, Karn allowed his case to be dismissed as moot. [FN114]

The Bernstein Case. The facts in Bernstein resemble those in Karn in several respects: On June 30, 1992, Daniel Bernstein, then a Ph.D. candidate in mathematics at Berkeley working in the field of cryptography, submitted a CJ request for a "zero delay private-key encryption system" that he named "Snuffle." The request covered both an academic paper describing the Snuffle system and the Snuffle algorithm in source code form. DTC did not rule on the paper but determined that the source code was a defense article. Bernstein submitted a second CJ request asking for separate determinations of the academic paper, other explanatory materials and the source code, to which DTC responded by classifying all of these items as defense articles (although during the lawsuit it reversed itself as to the paper and explanatory materials). After exhausting his administrative appeals, Bernstein filed a federal lawsuit raising the same (plus some additional) constitutional claims as Karn. The government then moved to dismiss on the grounds that these issues were nonjusticiable.

On April 15, 1996, Judge Patel denied the defendant's motion to dismiss, finding that even though the Arms Export Control Act ("AECA") precludes judicial review of CJ determinations, it does not bar judicial review of Bernstein's constitutional claims, provided that they were "colorable" (i.e., not wholly insubstantial or frivolous or raised only for jurisdictional purposes). In so finding, Judge Patel rejected the government's argument that source code is akin to unprotected conduct, instead reaching the unprecedented conclusion that "[f]or the purposes of First Amendment analysis,... source code is speech."

In December 1996, Judge Patel granted plaintiff's motion for summary judgment and held that the licensing requirements for cryptographic software and related technical data as set forth in Category XIII(b) of the USML were an unconstitutional prior restraint of protected speech in violation of the First Amendment. [FN115] The court rejected the government's attempt to justify the prior restraint on national security grounds and instead evaluated the ITAR licensing scheme in light of three procedural safeguards required by the First Amendment: a reasonable time limit for licensing decisions, prompt judicial review, and a government burden of defending its decision in court. Judge Patel concluded, "the ITAR scheme, a paradigm of standardless discretion, fails on every count."

Within a few days of this ruling, BXA issued its new regulations and Bernstein amended his complaint to include them. In August 1997, Judge Patel ruled that the EAR encryption regulations also lacked the required procedural safeguards set forth in Bernstein II and held, in unusually broad language, that "all rules, policies and practices" of the EAR "as they apply to or require licensing for encryption and decryption software and related devices and technology" violate the First Amendment and shall not be applied to plaintiff's "publishing of such items, including scientific papers, algorithms or computer programs." [FN116] In reaching this conclusion, Judge Patel scoffed at the government's attempt to characterize encryption software as subject to export controls because of its "functional capacity" rather than because of its "informational value" [FN117] or to differentiate between printed material setting forth encryption source code and such code in electronic form. The court not only rejected these distinctions but also characterized the exception for printed materials as "irrational" (because it makes little or no sense to distinguish paper and electronic publication) and "administratively unreliable" (because BXA reserved the right to control scannable source code in printed form). Finally, the court enjoined the government from threatening, prosecuting, or bringing a civil action against Bernstein or others using or publishing Snuffle and related materials.

In the weeks following the decision, the district court issued an emergency stay of its injunction pending appeal, which had the legal effect of blocking Bernstein or anyone else from posting Snuffle on the Internet. (In fact, Snuffle has been widely available on the net for years and was apparently distributed soon after the decision via a listserv to "hundreds of thousands" of non-U.S. email addresses.) [FN118] The government appealed the district court's opinion, [FN119] and in May 1999, the Ninth Circuit affirmed the lower court's decision. [FN120] Thereafter, the government petitioned the Ninth Circuit for a rehearing of the case, and in September 1999, the Ninth Circuit granted the government's request to rehear the case en banc. [FN121] However, in January 2000, the court remanded the case to the original three-judge panel for further consideration in light of the January 14, 2000 regulations.

Comment. In analyzing how the two courts reached contrary decisions notwithstanding the similarity of the underlying facts, perhaps their most striking disagreements concern the appropriate standard to apply in determining whether the ITAR and EAR validly restrict the plaintiffs' First Amendment rights and how the "political question" doctrine bears on this decision. The Karn court looked at the rationale for the regulation and found that if the regulation is content-based, then it is "presumptively invalid," whereas if the regulation is content-neutral, then the government may justify the regulation under the test developed in U.S. v. O'Brien, 391 U.S. 367 (1968). [FN122] In Karn, the court held that the AECA regulation was content-neutral and satisfied the "narrowly tailored" prong of the O'Brien test. Judge Richey treated the plaintiff's narrow-tailoring argument as, in essence, a disagreement with the policy judgment of the President, which was not subject to review under the "political question" doctrine. He also found that the regulations did not violate Karn's due process rights because there was a reasonable fit between the governmental purpose and the means chosen to advance that purpose.

In Bernstein, the court focused instead on the nature of the matters regulated, i.e., whether the encryption system constituted "pure speech" or "conduct." Because Judge Patel found no meaningful difference between computer languages (particularly high-level languages such as source code) and foreign languages (because both "participate in a complex system of understood meanings within specific communities"), she concluded that the Snuffle encryption system was protected speech and rejected the O'Brien test (but applied the test anyway to determine whether Bernstein had a colorable constitutional claim). While Judge Patel acknowledged the "political question" doctrine, she did not share Judge Richey's reluctance to involve the courts in reviewing the constitutionality of the ITAR.

Despite these differences, there are also some substantial areas of agreement between the two courts. First, the Karn and Bernstein courts both agree that books, academic writings, and papers are outside the scope of the AECA and are "protected speech" under the First Amendment. Therefore, both plaintiffs may export, publish or sell these written materials without a government license. Second, both courts agree that the AECA bars judicial review of the Administration's designation of items as defense articles, but does not bar constitutional claims. Finally, both courts state that if the O'Brien test applies, the government has the constitutional power to regulate the export of cryptographic software and that the ITAR and EAR do further an important or substantial government interest.

How important are these cases to exporters of commercial encryption systems? With the recent liberalizations, these constitutional questions become largely academic. But even under the older rules, it is unlikely that either case would ultimately have had a major effect on commercial exporters of encryption products. Obviously, if either plaintiff had prevailed on First Amendment grounds, then the ITAR or EAR would have been struck down at least insofar as they regulated the dissemination of encryption source code. But most (if not all) of the encryption source code at issue in the Karn case is already available on the Internet from non-U.S. sites. Moreover, most commercial encryption products are sold in object code form, and while Judge Patel's decision does not reach the issue of whether "low level" languages constitute speech, the analogy between machine-readable code and foreign languages is not very compelling. Thus, even if the plaintiffs had prevailed on their First Amendment claims, such a decision might have had limited impact on commercial exporters unless the court struck down encryption controls in such sweeping terms as to eliminate the need for licenses to export any and all encryption software products.

§ 4(b) Crypto with a Hole

The term "crypto with a hole" has been used to denote a method for circumventing export controls by selling a product that lacks encryption functionality at the time of export, yet allows a foreign customer to insert cryptographic components overseas. The name derives from a hardware scenario in which a manufacturer removes the security chip from a secure telephone but equips the phone with an empty chip socket. Arguably, the phone avoids EI controls because it is not a cryptographic device as shipped from the U.S. Prior to the jurisdictional transfer from the State Department to the Commerce Department, DTC rejected this argument, treating a phone with an empty chip socket as functionally equivalent to a phone with a security chip and imposing the same licensing requirements on both items. [FN123]

The Executive Order mandates the transfer of "[E]ncryption products that presently are or would be designated in Category XIII" of the USML. [FN124] Accordingly, it is reasonable to assume that past interpretations regarding the scope of Category XIII as applied to crypto with a hole scenarios continued to apply under the EAR. In fact, in language reminiscent of Category XIII(b)(5), ECCN 5D002 specifically states that "5D002.a controls 'software' designed or modified to use 'cryptography' employing digital or analog techniques to ensure 'information security'." Moreover, the recent amendments to the EAR introduced specific rules controlling "open" and "closed" cryptographic interfaces.

In discussing BXA licensing policy regarding "crypto with a hole," it is useful to distinguish between software that is designed to have encryption inserted via a programming interface and software with a general programming interface that is not encryption-specific, even though it may incidentally permit the insertion of various functions including encryption. While the former is EI-controlled, it does not appear that BXA would control the latter for EI reasons.

§ 4(b)(1) Crypto APIs

A Cryptographic Application Programming Interface (Crypto API or CAPI) provides various applications (word processors, spreadsheets, and databases) with a means of requesting encryption-specific services (key management, authentication, data confidentiality) from an operating system (Windows, UNIX, Macintosh). These encryption services may be bundled with the operating system product or supplied as add-ons in the form of a cryptographic library (a collection of cryptographic routines stored in a file). In this scenario, the operating system is the telephone with the empty socket and the cryptographic library is the security chip that plugs into the socket and provides the encryption functionality.

A programming interface may be either documented or undocumented, and may be either open or closed. A documented programming interface is publicized for the general use of third party programmers, so that they may implement the specific service with features of their own choosing. An undocumented programming interface is not publicized, and in some cases might be intentionally obscured in the program code, in order to make it difficult for anyone to access the interface except as intended by the application and its developers. Undocumented programming interfaces exist solely for the original developer's coding purposes--to allow them to implement a single service or feature in the product in a specific way.

In addition, CAPIs can be "closed" through technical means that prevent unauthorized parties from freely inserting cryptographic libraries and making them available to applications through the API. For example, the CAPI can be designed such that it will only recognize cryptographic modules or libraries that have been digitally signed by the original developer of the operating system.

CAPIs have usually been undocumented or closed due to U.S. export restrictions. If they were fully documented and open, it would be impossible to prevent third party programmers from implementing encryption services that would require EAR licensing if shipped as part of the operating system. [FN125] Accordingly, just as BXA restricts "crypto with a hole" (because it allows overseas vendors to plug in the crypto of their choice), it also restricts fully documented or open CAPIs.

In June 1995, DTC set a precedent by approving a commodity jurisdiction transfer for Microsoft's CryptoAPI, a closed CAPI designed by Microsoft for applications running on the Microsoft Windows NT 4.0 and Windows 95 operating systems. CryptoAPI enables 32-bit Windows applications to take advantage of systems-level access to cryptographic functions such as key generation, key exchange, data encryption and decryption, hashing and digital signatures. [FN126] Vendors implement these functions as separate hardware or software modules called Cryptographic Service Providers ("CSPs"). Applications requiring cryptographic functions make programming calls to Windows, which in turn loads CSPs that have been digitally signed by Microsoft. This signature requirement avoids the "crypto with a hole" problem by effectively limiting access to CryptoAPI to those vendors that agree to comply with U.S. export controls. Because CryptoAPI effectively isolates cryptographic functions from the application and compartmentalizes encryption services in the CSP, both CryptoAPI-enabled applications and the host operating system (Windows NT) were deemed exportable; only the CSPs intended for overseas customers were subject to review.

Microsoft signs CSPs from U.S. vendors that agree to market the CSPs in the U.S. or Canada only, or in conformity with U.S. export law. For foreign CSP developers, Microsoft was previously required to obtain an individual export license in order to sign the CSP. The October 19, 2000 regulations, however, fundamentally changed the treatment of foreign-developed CSPs. Foreign-developed CSPs are no longer subject to review and individual licensing before they can be signed. Instead, Microsoft can freely sign or otherwise enable foreign CSPs, subject only to biannual reporting.

§ 4(b)(2) General Purpose APIs

In contrast to Crypto APIs, general purpose programming interfaces are a set of routines used by an application to direct the performance of a wide variety of procedures by the computer's operating system. They may also permit the addition of cryptographic functions even though they are not designed with this in mind. Arguably, EI controls do not apply to general purpose APIs, notwithstanding the fact that a third party programmer might--with some effort--add encryption capability to the program. Indeed, general purpose interfaces such as Win32, MAPI, COM, CORBA, and the UNIX system interfaces are so pervasive that if EI controls prohibited the export of products merely because they contained these open interfaces, many major U.S. software vendors could not sell any of their products overseas.

Over the past few years, a large number of foreign vendors took advantage of U.S. export controls and the existence of these general purpose APIs to develop and market 128-bit add-on products that are fully interoperable with export versions of U.S. products. [FN127] How was this possible? To begin with, the specifications for Internet security protocols are publicly available on the Web and elsewhere. (A good example is SSL, which stands for Secure Sockets Layer, a communications protocol for transmitting encrypted data between a Web client and Web server). [FN128] Similarly, free encryption source code and free implementations of 128-bit SSL have been available from numerous non-U.S. sources. [FN129] Thus, a foreign vendor could easily obtain this source code and, utilizing general purpose APIs together with other exportable development tools and programming languages, design and implement cryptographic add-ons that convert U.S. products from 40 bits to 128 bits, without violating U.S. export laws--or so it seems. [FN130] Although EAR § 744.9(a) clearly prohibits U.S. firms from providing "technical assistance" to these foreign vendors in the development or manufacture abroad of EI-controlled encryption software such as 128-bit add-ons, a U.S. firm could point overseas customers to these foreign vendors by adding a hyperlink to certain encryption programs overseas. [FN131] Of course, this business opportunity for foreign companies -- which was created by the existence of U.S. export controls -- is quickly drying up as U.S. vendors begin to take advantage of the recent EAR liberalizations by widely exporting full strength versions of their products.
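The SSL handshake and cipher negotiation described above are now exposed directly by standard libraries. As a minimal sketch (using Python's standard `ssl` module, whose modern TLS support descends from the SSL protocol; the host name in the commented example is illustrative):

```python
import socket
import ssl

# Minimal sketch of an SSL/TLS client using Python's standard ssl
# module. The handshake negotiates a cipher suite with the server;
# strong suites use 128-bit or longer symmetric keys -- the key
# lengths at issue in the export debate.
def negotiated_cipher(host: str, port: int = 443) -> str:
    context = ssl.create_default_context()  # verifies the server certificate
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            return tls_sock.cipher()[0]  # name of the negotiated cipher suite

# Example usage (requires network access):
#   print(negotiated_cipher("www.example.com"))
```

The point of the sketch is that the protocol itself is public: any vendor, foreign or domestic, can implement it from the published specification, which is exactly why export controls could not keep 128-bit SSL implementations out of foreign hands.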


For several years, United States encryption policy has focused on controlling and monitoring the export of strong encryption. However, this focus has evolved in major ways. Specifically, the export policy has evolved from case-by-case licensing of individual encryption exports, to policies designed to encourage "key escrow" or "key recovery" encryption systems, to broad approvals for exports to certain preferred industry sectors, and finally to nearly free exportability of most products with after-the-fact reporting.

This evolution is due, in large part, to the computer industry's constant pressure on the Administration and on Congress to liberalize U.S. export controls on products with encryption features. Although Congress was ultimately unable to reach agreement on a legislative solution due to strong opposition from the intelligence and defense communities (NSA) and law enforcement (FBI), the threat of Congressional action played an important role in moving the Administration through its various stages of liberalization.

Starting with the Clipper Chip initiative, the Clinton Administration policies that sought to use export policy to encourage the development of key escrow or key recovery products (in order to ensure government access to encrypted data) met with mixed success at best. In fact, because there was never a significant market demand for such products, the Administration's carrot-and-stick approach created a lose-lose situation by driving foreign customers to foreign suppliers, thereby undermining the competitiveness of U.S. software vendors, but without achieving a broad adoption of key recovery technology in worldwide markets.

In contrast, the Administration's late 1998 policy represented a positive step in the right direction. It abandoned a one-size-fits-all approach to key recovery in favor of a series of discrete steps that promote U.S. competitiveness in global markets while also supporting law enforcement and national security concerns. In particular, it allowed U.S. companies to sell strong encryption products to preferred industry sectors, including banks and financial institutions, insurance companies, U.S. multinationals, on-line merchants and the health and medical sectors, and it recognized a new category of "recoverable" products that allow access to encrypted data without sacrificing security or imposing undue burdens on customers. However, this sector-by-sector approach still did not allow companies to sell their products through normal mass-market channels. Instead, individual sales still had to be screened on a case-by-case basis in order to determine whether the particular customer was eligible to receive the strong encryption product.

Finally, in early 2000, the policy evolved to the point where companies that rely on mass market distribution models can largely meet worldwide customer demand for strong encryption. However, the policy is still not perfect. Despite the fact that almost all encryption items are exportable worldwide (except to the embargoed destinations), the regulations remain needlessly complex and contain burdensome requirements that impose costs and delays that may keep U.S. companies at a competitive disadvantage vis-a-vis foreign suppliers. This step-by-step liberalization over the past several years has brought us to a state in which strong encryption is now freely available worldwide, and U.S. policy as reflected in the current regulations acknowledges and contributes to such availability. Unfortunately, there has not yet been any attempt to start over and write regulations from scratch that reflect this new reality. Instead, U.S. exporters are still forced to navigate very complex rules and burdensome processes merely to accomplish what is ultimately a permissible export.

Appendix 1

Major Phases of U.S. Policy on Encryption Export Controls

Over the past decade, the U.S. computer and financial service industries have sought administrative and legislative relief to ease export controls on mass-market software products with encryption capabilities. These efforts, and the government's response thereto, may be somewhat arbitrarily divided into thirteen phases, which are described below.

The Core List Exercise. On November 16, 1990, the White House issued a directive to remove from the U.S. Munitions List (USML) by June 1, 1991 all items contained on the CoCom Core List unless significant U.S. national security interests would be jeopardized. [FN132] In response, the State Department set up working groups to "rationalize" the USML (i.e., either transfer munitions items to Commerce jurisdiction or justify and retain them on the USML), scheduled industry briefings, and sought industry comments and recommendations for the respective working groups.

The software industry responded by proposing the creation of a general exception for all "mass market" software, i.e., software that is generally available to the public through retail outlets and related forms of distribution. In addition, the industry recommended adding a note to the controls on encryption technology clarifying the decontrolled status of certain types, uses, and implementations of encryption in mass-market software products. Then, in late February 1991, the CoCom allies reached tentative agreement to include in the Core List a note decontrolling mass-market software, including software with encryption capabilities. Although the U.S. voted in favor of this approach at the CoCom talks in Paris, it soon became apparent that the U.S. would impose strict unilateral controls on mass-market encryption software. [FN133]

On August 27, 1991, the State Department published a proposed rule transferring several types of cryptographic products and technologies to Commerce jurisdiction and allowing consideration of commodity jurisdiction for other cryptographic goods and technology "on a case-by-case basis when appropriate in light of the national security interests." These transfers were virtually identical to the decontrol allowed under an August 11, 1989 memorandum by James LeMunyon, then Deputy Assistant Secretary of Export Administration. In short, the "Core List" exercise culminated in ITAR revisions that merely codified the status quo.

The Levine Amendment. After failing to obtain any significant administrative decontrol of encryption software through the Core List exercise, the software industry began to seek legislative relief, arguing that strict controls on mass-market encryption software were ill-conceived for two reasons. First, they failed to protect national security interests because such programs were readily available abroad. Second, they harmed U.S. economic interests because the time and expense of the export license process, together with the restrictions on DES and other strong cryptographic algorithms, placed U.S. software publishers at a competitive disadvantage relative to foreign vendors. In mid-September, the House Foreign Affairs Committee marked up a bill to reauthorize the EAA and then Rep. Mel Levine (D-Cal) introduced an amendment to transfer jurisdiction to Commerce over all mass market software programs including those with encryption capabilities. The Bush Administration "strongly opposed" these measures on national security grounds, threatening to veto the reauthorization bill unless this and several other provisions were deleted.

The Software Publishers Association (SPA) Agreement. Due to uncertainty over the enactment of the EAA re-authorization bill and the strong encouragement of several members of Congress, the software industry began meeting with Bush Administration officials to see if their concerns could be addressed within the framework of amended ITAR controls. These discussions culminated in the July 20, 1992 announcement amending the ITAR to allow exports of mass-market software products utilizing RC2 and RC4 at 40 bits for data encryption. In a letter to Congress from then National Security Advisor, General Brent Scowcroft, the Administration also agreed to hold semi-annual meetings with the industry beginning in September 1992. These meetings would periodically review the technical criteria for expedited transfer and consider issues such as whether the algorithms and key lengths needed adjustment to reflect advances in computing power or newly developed encryption technology. Additionally, they would address the much debated issue of whether foreign availability of encryption software harmed U.S. competitiveness.

Although the SPA agreement did not achieve full export decontrol of mass-market software with encryption capabilities, it represented a significant step forward in several respects. First, DTC published explicit export criteria for two named algorithms and expedited CJ review of mass-market software products that incorporated these algorithms for data encryption. This allowed software publishers for the first time to design products knowing that if they satisfied these criteria, their products would be exportable and they would not face a lengthy or uncertain export approval process. Second, the agreement established a mechanism to review foreign availability and to consider future export control reforms.

Clipper 1. On April 16, 1993, the Clinton Administration announced an NSA-designed, tamper-proof encryption chip called the "Clipper" chip together with a split-key approach to escrowing keys. [FN134] The Clipper chip uses a classified algorithm called Skipjack that is purportedly more secure than DES. [FN135] Each chip also contains a unique key which is split into two parts at the time of manufacture for deposit with two government escrow agents, which would provide them to law-enforcement agencies upon presentation of a valid warrant. By combining extremely strong security with a key escrow system, the Clinton administration hoped to balance the competing demands of industry and individuals for highly secure communications with the needs of law-enforcement agencies to conduct court-authorized wiretaps of encrypted communications. Moreover, the Administration promised that devices incorporating the Clipper chip would be exportable to most countries.
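The two-agent split-key arrangement can be illustrated as a simple 2-of-2 secret split: each escrow agent's share looks random on its own, and only the combination of both shares recovers the chip's unique key. The Skipjack algorithm and the Clipper chip's internals are classified, so the sketch below is purely a conceptual stand-in for the splitting idea, not the actual escrow mechanism.

```python
import secrets

# Conceptual sketch of Clipper-style split-key escrow: the chip's
# unique key is split into two shares via XOR, one deposited with
# each government escrow agent. Either share alone reveals nothing
# about the key; XOR-ing both shares recovers it.
def split_key(key: bytes) -> tuple[bytes, bytes]:
    share1 = secrets.token_bytes(len(key))               # agent 1's deposit
    share2 = bytes(a ^ b for a, b in zip(key, share1))   # agent 2's deposit
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    # Performed only after both agents release their shares under a warrant.
    return bytes(a ^ b for a, b in zip(share1, share2))

unit_key = secrets.token_bytes(10)  # Skipjack uses an 80-bit (10-byte) key
s1, s2 = split_key(unit_key)
assert recombine(s1, s2) == unit_key
```

Requiring two independent agents is the design's safeguard: no single escrow holder (or single compromised database) can reconstruct the key, yet law enforcement can do so once both agents cooperate.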

The Clipper chip is specifically designed for secure voice products. NSA also designed the "Capstone" chip, which not only implements Skipjack and the same key-escrow features as Clipper, but also several additional cryptographic functions including a digital signature algorithm. Capstone (later re-named "Fortezza") was mainly intended for use in PCMCIA cards. A complete discussion of the policy and technical issues implicated by the Clipper and Fortezza chips is beyond the scope of this paper. [FN136] For present purposes, two points are relevant. First, U.S. industry expressed considerable skepticism over whether an international market existed for key-escrow products given that the U.S. government would hold the keys for products using Clipper or Fortezza. Second, even though the Administration repeatedly emphasized that it had no plans to mandate key-escrow or ban the use of encryption regimes lacking key escrow, the U.S. software industry insisted that the best proof of this policy would be further liberalization of export controls on non-key-escrow encryption software. With its new emphasis on "recoverable products" and developing the KMI, the Clinton Administration has largely abandoned the failed Clipper policy.

The Cantwell Bill. On November 23, 1993, then Rep. Maria Cantwell (D-WA) introduced legislation (H.R. 3627) transferring jurisdiction over encryption software to Commerce. Specifically, the Cantwell bill would:

1. Confer on Commerce exclusive jurisdiction over encryption software and technology except for programs which are specifically designed for military use;

2. Prohibit validated licenses for exports of mass market software with encryption capabilities (or computer hardware incorporating such software) while maintaining controls on terrorist and embargoed countries; and

3. Permit validated licenses for exports of DES to commercial users in countries where exports of DES have been approved for financial institutions.

Echoing the arguments in support of the Levine Amendment, proponents of the Cantwell bill argued that encryption software and technology are available worldwide and that current controls (including the SPA agreement) prevented the U.S. software industry from meeting ever increasing foreign demand for DES or algorithms of comparable strength. At the same time that supporters of the bill were seeking decontrol of all mass market software with encryption, they also realized that the government could use the availability and "exportability" of Clipper to offset and eventually eliminate commercial demand for products using non-key-escrow encryption schemes. Thus, the Cantwell bill quickly became a twin effort to ease export controls while also containing the government's efforts to deploy Clipper as a key-escrow alternative.

But once again Congress failed to move forward on enacting a re-authorization of the EAA and once again industry and its congressional supporters began informal discussions with the Administration on easing export controls on encryption. Only this time the export issues were further complicated by the Administration's earlier decision to back a key-escrow scheme.

The Gore Letter. On July 20, 1994, Vice President Gore sent a letter to Rep. Cantwell setting forth what appeared to be a new policy on encryption. [FN137] The Gore letter committed the government to working with industry, academia and privacy advocates to develop possible key escrow alternatives to Clipper. According to the letter, an alternative system would have to be implementable in software, firmware, hardware, or any combination thereof; would not rely upon a classified algorithm; and would be voluntary and exportable. In addition, the Clipper alternatives would permit the use of private-sector key escrow agents to give users more choices and flexibility in meeting their needs for secure communications. Finally, the Gore letter stressed that Clipper alternatives must contain safeguards to prevent illegal use or disclosure of escrowed keys.

On August 17, 1995, the Administration announced a new policy of allowing export of key escrow software encryption products with key lengths of up to 64-bits (including DES) provided that certain requirements were met:

1. The key was escrowed with a "certified" escrow agent;

2. The product passed a one-time government review to ensure that it satisfied certain export criteria;

3. Vendors submitted quarterly reports on where the products had been shipped.

The Administration sought to work out key elements of this new policy (including export criteria and the requirements for U.S. key escrow agents) at a series of workshops led by the National Institute of Standards and Technology (NIST). [FN138]

Clipper 2. On December 5, 1995 the Clinton Administration introduced a revised version of its key escrow cryptography export proposal, which was quickly dubbed Clipper 2. [FN139] The proposal still prohibited the export of strong cryptographic programs with the exception of 64-bit systems meeting strict key escrow criteria. NIST formulated the following criteria for licensing these systems:

1. Decryption keys would be escrowed with an agent certified by the U.S. government or by foreign governments with which the U.S. has a bilateral agreement covering access to keys;

2. Key escrow products would not decrypt messages or files encrypted by non-escrowed products including products where key escrow mechanisms have been altered or disabled;

3. Law enforcement would have access to decryption keys via either the sender or the intended recipient;

4. The system would be designed to prevent multiple encryption and repeated involvement by the escrow agents for recovery of multiple decryption keys during periods of authorized access; and

5. Key escrow products would be resistant to alterations that would disable or circumvent the escrow mechanism.

The Clipper 2 proposal diverged from the Gore letter in several respects. First, it would have required domestic users to use key escrow systems if they intended to engage in communications internationally. Second, it was silent concerning the privacy safeguards identified by the Vice President. Third, while the Gore letter did not impose any limits on key length, Clipper 2 limited key escrow systems to 64-bits. Finally, Clipper 2 would have required foreign countries to accept United States-based escrow of all keys until bilateral access agreements were entered into.

The Pro-CODE Bill of 1996. Still dissatisfied with the Administration's single-minded focus on key escrow solutions, the software industry again turned to Congress. On May 2, 1996, Sen. Conrad Burns (R-MT), along with five co-sponsors, introduced the Promotion of Commerce Online in the Digital Era (Pro-CODE) bill (S. 1726). This legislation restricted federal and state regulation of the distribution, sale, and export of computer hardware and software containing encryption capabilities. [FN140] Specifically, the Pro-CODE bill would:

1. Grant Commerce exclusive authority to control the export of all computer hardware, software, and technology with encryption capabilities, except for programs designed for military use;

2. Prohibit the Secretary of Commerce from formulating, adopting, and enforcing regulations regarding encryption standards for use by the private sector;

3. Prohibit federal and state government restrictions on the sale or distribution of products or programs with encryption capabilities in interstate commerce;

4. Prohibit federal and state government requirements of mandatory access to decryption keys as a condition of sale in interstate commerce; and

5. Require only general licenses for export or re-export of encrypted software or computer hardware with encryption capabilities, to all but embargoed or terrorist-supporting countries.

Clipper 3. On May 21, 1996, the Clinton Administration announced a revised key escrow policy based on a government-sanctioned key certification system. [FN141] This proposal sought to graft the objectives of key escrow policy onto the emerging public key infrastructure (PKI) for secure electronic communications and commerce (and dubbed the result the "Key Management Infrastructure" or KMI). All users of the KMI would have to ensure government access to their encryption keys through an approved key escrow authority. In effect, the KMI placed the government at the top of a hierarchical PKI, which would approve certificate authorities for operation provided they participated in the KMI scheme. The proposal set up various incentives for companies to support key escrow by offering government support for the KMI and relaxing export controls for products that supported the KMI scheme.

The December 1996 Regulations (aka Clipper 3.1). On October 1, 1996, Vice President Gore announced an "encryption export liberalization plan" which President Clinton implemented via a Memorandum and Executive Order dated November 15, 1996. [FN142] The Executive Order directed the transfer of jurisdiction over all "commercial encryption products" from the State Department to the Commerce Department. This transfer became effective when the Bureau of Export Administration (BXA) published an interim final regulation on December 30, 1996. [FN143] The regulation placed all commercial encryption products on the Commerce Control List (CCL) [FN144] of the Export Administration Regulations (EAR), [FN145] with the exception of products specifically designed or modified for military use, which remain subject to ITAR controls.

When jurisdiction was transferred from the State Department to the Commerce Department under the December 30, 1996 regulations, 40-bit mass market encryption items continued to be exportable under the General Software Note (GSN). [FN146] However, at the same time, the Commerce Department created the first and only major exception to GSN treatment for mass market software, declaring that mass market encryption software subject to EI controls (i.e. greater than 40-bit software) is excluded from the GSN. As Larry Christensen points out: "The theory underlying the GSN is that there is a subset of software that is so widely available to the general public that it is uncontrollable and therefore the CoCom governments should not impose an unmanageable control burden on the firms subject to such controls." [FN147] The U.S. government convinced its allies in the multilateral CoCom export control agreement to establish this free export treatment for mass-market software based on a recognition that such software could not be controlled effectively, regardless of the nature or the purpose of the controls (which then were based on protection of national security and nonproliferation of weapons of mass destruction). [FN148] When it came to mass market encryption software, however, the U.S. government disregarded this theory and proceeded on the assumption that such software can somehow be effectively controlled. [FN149]

The Commerce Department regulations mirrored the earlier ITAR regime by providing expedited one-time review in the form of a classification request to BXA (in the place of DTC's expedited CJ transfers) in seven working days for products utilizing either or both of two commercially available data encryption algorithms, RC2 and RC4, [FN150] as long as the encryption keys were 40-bit or less and any public keys used for key exchange were 512-bits or less (and certain other technical requirements were met). [FN151] If a mass-market product did not meet these technical requirements, BXA would process the request in fifteen days as long as the product otherwise qualified as mass-market software with data encryption capability. [FN152]

The December 30, 1996 regulations somewhat eased export controls on 56-bit encryption by allowing U.S. companies to export such products for the next two years provided that the companies committed to developing and distributing "key recovery" products.
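
The practical meaning of these key-length thresholds can be seen from simple brute-force arithmetic (an illustrative calculation, not part of the regulations; the assumed search rate of one billion keys per second is hypothetical):

```python
# Exhaustive-search effort for the key lengths at issue in these regulations.
RATE = 1e9  # hypothetical attacker testing one billion keys per second

for bits in (40, 56, 64, 128):
    keyspace = 2 ** bits
    days = keyspace / RATE / 86400
    print(f"{bits}-bit key: {keyspace:.2e} keys, ~{days:.3g} days at {RATE:.0e} keys/s")
```

Moving from 40 to 56 bits multiplies the search effort by 2^16 (about 65,000), while 128-bit keys put exhaustive search far beyond any foreseeable computing power, which is why the key-length ceilings were the central point of contention between industry and the government.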

The regulations also introduced the new License Exception KMI, discussed in section 3(b)(3)(iii) above, which permitted "recovery" encryption software and equipment to be released from EI controls and made eligible for a license exception for all non-embargoed destinations. This category included "key escrow" or "key recovery" products in which the keys or other information required to decrypt a message or stored data are kept by a key recovery agent and are accessible to government officials under proper legal authority.

The SAFE Bill. On February 17, 1997, Cong. Robert Goodlatte (R-VA) introduced the Security and Freedom through Encryption (SAFE) bill, H.R. 695. [FN153] The SAFE bill enjoyed broad bipartisan support and eventually gathered over 250 co-sponsors. On May 14, the House Judiciary Committee unanimously approved SAFE with three amendments, including an amendment to narrow a provision of SAFE creating new criminal penalties for the use of encryption to obstruct law enforcement investigations of federal crimes. However, several other committees attempted to, or succeeded at, radically altering the SAFE bill via amendments: [FN154]

On July 22, 1997, the House International Relations Committee approved SAFE by a voice vote, rejecting an amendment offered by Committee Chairman Gilman (R-NY) that would have gutted the bill by allowing the Administration to regulate and/or deny encryption export licenses under a broadly defined "national security exception"; this exception would include law enforcement efforts to combat espionage, terrorism, illicit drugs, kidnapping, and other criminal acts. [FN155]

On September 9, 1997, the House National Security Committee approved an amendment offered by Reps. Weldon (R-PA) and Dellums (D-CA) that would give the President the ability to restrict encryption exports by only allowing Commerce to approve export licenses for encryption products with an encryption strength no greater than a maximum level set annually by the President. This amendment was actually a rollback from then-current policy in that it would give the Secretary of Defense a new veto power over export control decisions and made no allowances for exporting strong crypto to banks or foreign subsidiaries. [FN156]

On September 11, 1997, the House Intelligence Committee approved by voice vote an amendment in the nature of a substitute to SAFE that would impose unprecedented domestic controls on the manufacture and distribution of encryption products. The amendment, offered by Committee Chairman Porter Goss (R-FL) and ranking Democrat Norm Dicks (D-WA), would make it illegal to manufacture or import for sale or use in the United States, after January 31, 2000, any encryption product that did not include features that provided, in response to a court order, immediate access to plaintext data or decryption information. [FN157]

Finally, on September 24, 1997, the House Commerce Committee approved a modified version of SAFE by a vote of 35-6, at the same time rejecting a controversial FBI-backed amendment that would impose domestic controls on encryption. This modified version preserved the export relief provisions of the original SAFE bill and added an amendment calling for the creation of a National Electronic Technologies Center to assist law enforcement in coping with encryption encountered in the course of investigations. The amendment, by Reps. Markey (D-MA) and White (R-WA), also called for a study of the implications of mandatory key recovery, and increased the criminal penalties under SAFE for the use of encryption in the furtherance of a federal felony. [FN158] The defeated amendment, introduced by Reps. Oxley (R-OH) and Manton (D-NY), would require any encryption product manufactured or sold in interstate commerce, or imported into the United States, to include features that permit immediate access (pursuant to appropriate judicial process) to the plaintext of communications or electronic information encrypted by such product, without the knowledge or cooperation of the user. [FN159] A broad coalition of software, hardware, Internet and telecom companies, trade associations and public interest groups worked to defeat the amendment [FN160] and many viewed the outcome as a major setback to the FBI's plans for domestic controls. [FN161]
Following this spate of activity with respect to the SAFE bill, the bill ultimately stalled and died in the Rules Committee. Before the SAFE bill could move to the floor, the Rules Committee would have had to first reconcile these very different versions of the bill, a daunting task that was never completed.

The Secure Public Networks Bill. On June 17, 1997, Senators John McCain (R-AZ) and Bob Kerrey (D-NE) introduced the Secure Public Networks Act of 1997, S. 909. [FN162] The bill resembled draft legislation proposed earlier that year by the Clinton Administration. Unlike the Pro-CODE bill which it replaced, the McCain-Kerrey bill offered only limited export control relief in the form of a License Exception for 56-bit non-recovery encryption products and "expedited review" for export of strong crypto products to banks and certain other institutions. More significantly, the bill would create a number of regulatory incentives and legal penalties designed to compel individuals and corporations to adopt federally-approved key recovery systems. These include:

Requiring the use of key recovery systems in order to obtain the public key certificates needed to conduct secure electronic commerce;
Offering safe harbor liability protections to encourage the use of "federally licensed" key recovery agents; and
Requiring key recovery for all secure networks built with any federal funds -- including the Internet II project and most university networks.
In addition, the bill would allow law enforcement to access key information without a court order, using a subpoena issued without notice. It also creates new federal crimes dealing with the knowing use of encryption in furtherance of a crime and the unlawful disclosure of key recovery information.

On June 19, 1997, the Senate Commerce Committee approved the McCain-Kerrey bill (as a substitute for the Pro-CODE bill) with amendments, including several offered by Senator Frist (R-TN) that would strengthen the procedural requirements for the use of subpoenas and require the government to demonstrate that key recovery will work before such systems are deployed. [FN163]

The E-PRIVACY Bill. On May 12, 1998, Senators John Ashcroft (R-MO) and Patrick Leahy (D-VT) introduced the Encryption Protects the Rights of Individuals from Violation and Abuse in Cyberspace (E-Privacy) bill (S. 2067). [FN164] The bill:

Allows the domestic use of strong encryption without any "key recovery" requirement;
Eases export controls to allow U.S. firms to sell their encryption products overseas;
Strengthens protections from government access to decryption keys; and
Assists law enforcement in obtaining information on criminal activity by establishing a National Electronic Technology Center (NET Center) to serve as a focal point for information and assistance to Federal, State and local law enforcement authorities.
The September 1998 Regulations (aka The Financial Regulations). U.S. export policy has long favored banks and financial institutions by allowing them access to strong encryption products. [FN165] For example, prior to the December 30, 1996 regulations, DTC had a policy of granting USML licenses for strong encryption, on a case-by-case basis, to banks and financial institutions (whether U.S.-controlled or foreign) to secure financial data (whether in internal or interbanking communications). Moreover, both the ITAR, [FN166] and the December 30, 1996 regulations [FN167] included a specific exemption for cryptographic equipment and software "specifically designed ... for use in machines for banking or money transactions" including ATMs and point of sale terminals for the encryption of interbanking transactions.

Over the past several years, the Administration has issued a number of statements promising to continue and extend this favorable treatment for banks and financial institutions. For example, in his October 1, 1996 announcement, Vice President Gore indicated that "40-bit mass market products will continue to be exportable" while "longer key lengths will continue to be approved for products dedicated to the support of financial applications." [FN168]

Seven months later, BXA Under Secretary William A. Reinsch announced that BXA would issue regulations allowing the export of "the strongest available data encryption products" to support electronic commerce around the world. "Because banks and other financial institutions are subject to explicit legal requirements and have shown a consistent ability to provide appropriate access to transaction information in response to authorized law enforcement requests, key recovery will not be required for the financial-specific products covered by today's export announcement." [FN169] In August 1997, an anonymous source posted on the Internet a copy of the draft Financial Regulations dated July 25, 1997; thirteen months later, BXA published the Financial Regulations. In large measure, the delay resulted from a combination of interagency disagreements (especially Treasury Department concerns over money laundering) and the Administration's evolving key recovery policy (the July 25th draft would have conditioned the export of strong encryption products to banks and financial institutions on the vendor submitting a satisfactory key recovery business and marketing plan, whereas the final regulations dropped this requirement).

These developments are described in greater detail below. As finally published on September 22, 1998, the financial regulations authorized, after a one-time technical review, exports and reexports under License Exception KMI of:

Non-recoverable financial-specific encryption software for financial applications to secure financial transactions to any end-user in all but the embargo/ terrorist (T7) nations--Cuba, Iran, Iraq, Libya, North Korea, Sudan and Syria; and
General purpose non-recoverable (and non-voice) encryption software for banks and other financial institutions located in the 44 countries that are either members of the international anti-money laundering accord, the Financial Action Task Force (FATF), or have enacted anti-money laundering laws. [FN170]
The Financial Regulations also explicitly made 40-bit DES eligible for the 15-day review and release from EI controls. [FN171]

The December 1998 Regulations. On September 16, 1998, a week before publishing the "financial regulations" discussed above, the White House announced a broader series of measures easing controls on a wide range of products and end-users. As noted in the White House press release, these steps were the result of "several months of intensive dialogue between the government and U.S. industry, the law enforcement community and privacy groups that was called for by the Vice President and supported by members of Congress." [FN172]

Prior to the new policy announcement and regulations, BXA posted a fact sheet on "Licensing Practices for Encryption Items" listing certain items and classes of end users that "generally have been given favorable consideration in licensing review, subject to appropriate conditions" including the following:

Certain 40-bit encryption software, which are not eligible for the mass market provisions of License Exception TSU (see Supplement 6 to Part 742 of the EAR), and certain 40-bit encryption hardware, being exported to most end users;
Certain 56-bit encryption commodities/software being exported to U.S. subsidiaries or to "financial institutions";
Certain encryption commodities/software that implement the Secure Electronic Transactions (SET) protocol or other "financial-specific," products being exported to most end users to secure electronic commerce transactions/communications;
Certain 128-bit encryption commodities/software that have a recovery feature, being exported to U.S. subsidiaries, "financial institutions", or to certain (commercial) foreign end users;
Certain 128-bit encryption commodities/software for "home banking" purposes as long as the product is limited to bank/client communications/transactions and does not allow client-to-client communications/transactions;
Certain 128-bit general purpose encryption commodities/software being exported to "financial institutions" for inter/intra banking, provided the manufacturer of the exported product has an approved KMI Commitment Plan;
Certain 128-bit general-purpose encryption commodities/software being exported to U.S. subsidiaries provided the product allows for recovery of the plaintext.
The December 1998 regulations increased the transparency of U.S. policy by codifying and making eligible for license exceptions many of those exports that the government had been approving under Encryption Licensing Arrangements and individual licenses.

Thus, the regulations created a new License Exception ENC, which expanded the sector relief previously available for banks and financial institutions to include several additional sectors. The expanded sector relief included U.S. multinationals and their subsidiaries, health and medical organizations, and "online merchants" engaged in e-commerce with the public at large.

Subsidiaries of US Multinationals. License Exception ENC allowed the export of encryption products of any key length, with or without key recovery, to subsidiaries of U.S. companies worldwide (except to the T7) for the protection of internal business operations. [FN173] Semi-annual post-facto reporting was required. BXA also committed to extend favorable treatment to "strategic partners" of U.S. multinationals under individual licenses.
Insurance Companies. Insurance companies were added to the definition of "financial institution." The result was license exception treatment for insurance companies headquartered in the 44 FATF nations listed in the September 1998 amendments to the EAR relating to banks and financial institutions.
Health and Medical Organizations. License Exception ENC also permitted the export of encryption products of any key length, with or without key recovery, to organizations whose primary purpose is the lawful provision of "medical or other health services" in the 44 FATF countries, not including biochemical firms, pharmaceutical firms and military agencies, except by individual license. Semi-annual post-facto reporting was also required. Unlike the treatment given to banks and financial institutions, this relief did not include branches or affiliates outside the 44 countries.
On-Line Merchants. Finally, License Exception ENC permitted export of client-server applications tailored to on-line transactions (e.g., SSL-based products), with any encryption algorithm and with any key length, with or without key recovery, to "on-line merchants" in the 44 FATF countries. BXA defined an "on-line merchant" as "an entity regularly engaged in lawful commerce that uses means of electronic communications (e.g., the Internet) to conduct commercial transactions." [FN174] And the end-use was limited to "the purchase and sale of goods and software; and services connected with the purchase or sale of goods and software." [FN175] However, foreign merchants (non-U.S. owned and controlled) that sell items and services controlled on the U.S. Munitions List were excluded from this policy (although for merchants having separate business units, only those units selling munitions items were excluded). Semi-annual post-facto reporting was required.
In addition, the regulations extended license exception treatment for 56-bit DES and equivalent products. Under the rules, License Exception ENC covered hardware and software exports of up to "56-bit DES and equivalent" to all users and destinations (except to the T7) after a one-time technical review. [FN176] The regulations eliminated the key recovery plans and renewals of existing key recovery plans that were required under the December 1996 regulations. This release included 56-bit DES and several other popular commercial algorithms such as RC2, RC4, RC5 and CAST with 56-bit keys as well as products with asymmetric key sizes up to 1024-bits. Semi-annual post-facto reporting of quantity and ECCN was required for exports to non-Wassenaar countries and for any exports of non-mass-market products to military and government end-users.

The regulations also created a new category of encryption products -- so-called "recoverable products" (products or systems that allow access to plaintext through a network administrator who is independent of the user) [FN177] -- which would be approved for export under Encryption Licensing Arrangements, with no limit on key length or algorithm, to "foreign commercial firms" located in 42 countries (the 44 FATF countries minus Croatia and Singapore). For firms based in 21 of these countries, [FN178] exports of "recoverable" products would be permitted to their subsidiaries and branches, located worldwide (except the T7). The end-use of these products is limited to the protection of internal company proprietary information (i.e., the products could not be sold for individual use). This policy of approval excluded exports to service providers (i.e., telcos and ISPs) as well as any commercial firms or separate business units of commercial firms engaged in the manufacturing and distribution of products or services controlled on the U.S. Munitions List. Finally, unlike the other parts of the policy announcement, this relief is not automatic upon the publication of the regulations. This is merely a policy of approval for ELAs. Thus, exporters must apply for and receive ELAs or individual export licenses covering the export of specific products to foreign firms or groups of firms in order to take advantage of this policy. [FN179]

Finally, the regulations relaxed the requirements for "key-recovery" products by eliminating the requirement to name key recovery agents and have them approved according to the strict criteria of Supplement 5 to Part 742 (which was removed from regulations).

The January 2000 Regulations. On September 16, 1999, the White House announced a major liberalization of U.S. encryption controls. The basic principles of this announcement were:

Any encryption item would be exportable to any non-government end-user in any country (except the seven embargoed destinations).
Any encryption item that fit into the new category of "retail encryption" would be exportable to any end-user in any country (except the T7).
64-bit encryption items that met the definition of "mass market" would be exportable.
U.S.-based foreign employees of U.S. companies could receive encryption items and technology without a license (thereby eliminating the deemed export rule for these foreign nationals).
Following the announcement, the Commerce Department and other agencies involved began a process of close consultation with industry in developing the regulations to implement this announcement. Two "discussion drafts" of the regulations were released -- one dated November 19, 1999 and one dated December 17, 1999 -- with industry groups and others being given the opportunity to comment.

On January 14, 2000, the U.S. government issued the long-awaited export regulations that allowed most U.S. companies to ship strong encryption products to their customers worldwide. [FN180] As indicated in the White House announcement, the new regulations divided strong encryption products into two categories: "retail" and "non-retail". Both categories were made exportable under "License Exception ENC," but the scope of the exportability differed. "Retail" encryption products could be exported to any end-user anywhere in the world (except those in the embargoed countries). "Non-retail" encryption products were made exportable only to non-government end-users -- sales to foreign governments still required an export license.

Before any U.S. strong encryption product could be exported under License Exception ENC, it was required to undergo a one-time technical review. Existing products that had previously gone through a BXA review under the old rules were automatically made eligible for immediate export to any non-government end-user. But in order to determine whether a product could be classified as "retail," so that it could take advantage of the broader exportability, it had to go through a new review and classification. And all new products (or new versions in which the encryption capabilities had been changed) had to be reviewed and classified prior to export.

In addition to the prior technical review, exports of strong encryption products under License Exception ENC were subject to reporting requirements. However, in response to industry feedback, several exceptions to the reporting requirements were included in the final regulation. For example, the reporting requirements for encryption products generally involved only reporting the first export that was made (usually to a distributor or reseller). And even for direct exports to end-users, there were exceptions for anonymous distributions (such as Internet downloads) and for direct sales of retail products to an individual (as opposed to a commercial entity or other organization). Nevertheless, many exporters complained that the reporting requirements represented a "rollback" for certain exports, since under the previous regulations, direct exports to "banks and financial institutions" under License Exception ENC did not require reporting.

The January 2000 regulations also contained an important exception to the technical review and reporting requirements for exports to U.S. subsidiaries. Under these regulations, U.S. companies could freely export any cryptographic hardware, software, source code, or technology to their subsidiaries without any prior technical review by the government, and there were no post-export reporting requirements. As a result, code and technology could be freely distributed among a U.S. company's subsidiaries worldwide. However, any products or other items created using the U.S.-origin content would remain subject to U.S. export controls.

Finally, the regulations created new rules specific to license exception eligibility for encryption source code. The regulations created three categories of source code, with a different set of rules for each. "Unrestricted" encryption source code (i.e., "open source" code) was released from EI controls and made eligible for License Exception TSU. Publicly available commercial source code remained subject to EI controls (including reporting requirements and other restrictions), but was generally exportable under License Exception ENC. And non-publicly available commercial source code also remained subject to EI controls, required a technical review prior to export, and was exportable under License Exception ENC only to non-government end-users.

While the January 2000 regulations represented a major step forward in the U.S. government's approach to encryption, exporters continued to point out several problems with the rules. One common complaint was that even though the regulations permitted, for the first time, the vast majority of encryption products to be exported to any end-user worldwide (and the remaining "non-retail" products to be exported to nearly every end-user), the regulations remained extremely complex, making permissible exports far more difficult, confusing, and costly than they needed to be. Similarly, according to many exporters, maintaining the requirements of pre-export technical reviews and post-export reporting imposed costs and delays that kept U.S. companies at a competitive disadvantage vis-a-vis foreign suppliers. Another complaint about the regulations was that by releasing only "non-commercial" source code from EI controls, the rules created an unfair advantage for businesses that rely on an "open source" software development model over U.S. companies that rely on proprietary code.

The October 2000 Regulations. The latest regulatory update to the U.S. encryption export rules was published on October 19, 2000. [FN181] These regulations focused on several small changes aimed at addressing some of the concerns raised with the January 2000 regulations, and on responding to a move by the European Union to create a "license-free zone" for encryption items encompassing the EU, the United States, Canada, and eight additional countries.

As a result, a new category was added to License Exception ENC authorizing the export of any encryption items (except certain cryptanalytic items) to any end-user in the EU+8 countries and to the worldwide subsidiaries of companies and organizations headquartered in those countries. [FN182] U.S. exporters are permitted to ship encryption items under this provision immediately upon submission of a classification request to BXA, rather than having to wait for the review and classification to be completed.

While the post-export reporting requirements were not eliminated, as many exporters had requested, significant new exceptions to the requirements were added. For example, reporting is no longer required for "short-range wireless" encryption or "client Internet appliances and client wireless LAN cards". [FN183] And a major new reporting exemption was added for "retail operating systems or desktop applications ... designed for, bundled with, or preloaded on" single CPU computers, laptops or handheld devices. [FN184]

Additionally, a new provision was added allowing exporters to request that their products be made eligible for de minimis treatment. Encryption items were also made eligible for the "beta test software" provision of License Exception TMP (subject to reporting requirements).

The October 2000 regulations are another improvement in U.S. policy, but they did not address all the issues that were raised by exporters -- including the fundamental problem of complex and sometimes highly burdensome requirements that U.S. exporters must endure in order to make what is ultimately a permissible export. Nevertheless, U.S. exporters recognize that these regulations are the latest in a long series of liberalizations that have represented a dramatic evolution of U.S. policy over the course of the Clinton Administration.

FN1. Associate General Counsel, Microsoft Corporation (irar@microsoft.com). My thanks again to Tom Albertson of Microsoft and to Ben Flowe of Berliner, Corcoran, and Rowe for their many contributions over the years to my understanding of export licensing and crypto policy issues.

FN2. Corporate Attorney, Microsoft Corporation (mhintze@microsoft.com).

FN3. 65 Fed. Reg. 62909 et seq. (October 19, 2000), available on-line at http://www.bxa.doc.gov/Encryption/pdfs/EncryptionRuleOct2K.pdf.

FN4. This section is largely derived from Schneier, APPLIED CRYPTOGRAPHY-- PROTOCOLS, ALGORITHMS, AND SOURCE CODE IN C, John Wiley and Sons (1993) (hereinafter "Schneier"); and Fahn, "Answers to Frequently Asked Questions About Today's Cryptography," RSA Laboratories (1993). For a comprehensive study of U.S. encryption policy, see the report prepared by the Computer Science and Telecommunications Board (CSTB) of the National Research Council (NRC) "Cryptography's Role in Securing the Information Society" (May 30, 1996) (hereinafter, the "CRISIS report"). Chapter 4 includes an excellent (but dated) overview of relevant export controls; it is available on-line at http://www.eff.org/pub/Privacy/Key_escrow/Clipper_III/9605_nrc_cryptopolicy_draft.report. A useful source for worldwide crypto policy is Crypto Law Survey, found at http://cwis.kub.nl/~frw/people/koops/lawsurvy.htm.

FN5. On the history of early cryptographic systems, see generally Kahn, THE CODEBREAKERS: THE STORY OF SECRET WRITING, Macmillan (1967).

FN6. The RSA public-key system; see § 2(d), infra.

FN7. See Schneier, 129-138. This discussion assumes that the only way to break the cipher in question is a brute force search of the entire key space. In fact, many ciphers have flaws that make them susceptible to other types of attacks as well.
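The key-length arithmetic behind this brute-force assumption is simple exponential growth: each additional key bit doubles the search space. A minimal Python sketch of the comparisons made in these footnotes (the one-billion-keys-per-second search rate is a purely hypothetical figure for illustration, not a benchmark from the text):

```python
# Each additional key bit doubles the number of possible keys,
# so an exhaustive (brute-force) search space grows as 2**bits.
def key_space(bits: int) -> int:
    return 2 ** bits

# A 56-bit key space is 2**16 = 65,536 times larger than a 40-bit one.
print(key_space(56) // key_space(40))  # 65536

# Hypothetical attacker testing one billion keys per second; on
# average, half the key space must be searched before the key is found.
RATE = 10 ** 9  # keys/second (illustrative assumption only)
for bits in (40, 56, 128):
    seconds = key_space(bits) // 2 // RATE
    # 40-bit: roughly 9 minutes; 56-bit: roughly 417 days;
    # 128-bit: astronomically long at any plausible rate.
    print(bits, seconds)
```

This is why the 40-bit limit discussed below was commercially unacceptable, and why attacks on 40- and 56-bit keys (notes 9-13) succeeded while 128-bit keys remained out of reach.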

FN8. See "Minimal Key Lengths for Symmetric Ciphers to Provide Adequate Commercial Security," A Report by an Ad Hoc Group of Cryptographers and Computer Scientists, Business Software Alliance (1996), available on-line at http://theory.lcs.mit.edu/~rivest/bsa-final-report.pdf.

FN9. See "French Hacker Cracks Netscape Code, Shrugging Off U.S. Encryption Scheme," Wall Street Journal, August 17, 1995 at B3.

FN10. See "40-bit Crypto Proves No Problem," News.com (Jan 31, 1997), http://www.news.com/News/Item/0,4,7483,00.html.

FN11. See "48-bit Crypto Latest to Crack," News.com (Feb. 11, 1997), http://www.news.com/News/Item/0,4,7849,00.html.

FN12. See "Group Cracks 56-bit Encryption," News.com (June 17, 1997), http://www.news.com/News/Item/0,4,11678,00.html.

FN13. See the EFF press release, "EFF Builds DES Cracker that Proves that Data Encryption Standard Is Insecure" (July 17, 1998), http://www.eff.org/descracker.html. For a more detailed account, see CRACKING DES: SECRETS OF ENCRYPTION RESEARCH, WIRETAP POLITICS AND CHIP DESIGN, Electronic Frontier Foundation (1998).

FN14. See Federal Information Processing Standards Publication 46, "Data Encryption Standard" (Jan. 15, 1977). The U.S. government recently published a notice of a request for candidate encryption algorithms for an Advanced Encryption Standard ("AES") to replace DES; the new algorithm must support key sizes of 128 bits, 192 bits, and 256 bits. NIST hosts a Web page devoted to "AES Development Efforts" at http://csrc.nist.gov/encryption/aes/aes_home.htm.

FN15. For example, according to Schneier, as early as the 1980s, the East German company Robotron fabricated hundreds of thousands of DES chips for use in the Soviet Union. See, Schneier, 237.

FN16. In 1992, in the course of Congressional deliberations on whether to liberalize export controls, the Software Publishers Association (SPA) reached an agreement with the Bush Administration to ease the export restrictions on RC2 and RC4 as long as their key size did not exceed 40 bits. As noted previously, a 56-bit key generally is considered more secure than a 40-bit key because the former provides over 65,000 times as many key values as the latter. Thus, foreign customers tended to reject products incorporating RC2 and RC4 at 40 bits as a replacement for DES.

FN17. See "Cipher Probe: Popularity Overseas of Encryption Code Has U.S. Worried," Wall Street Journal, April 28, 1994 at A1.

FN18. Schneier, at 437. In March 1996, Zimmermann founded Pretty Good Privacy, Inc., to market commercial versions of PGP. Although PGP remains freely available for non-commercial use, the commercial version is not "in the public domain" given that it is copyrighted and utilizes patented RSA algorithms. RSA Data Security Inc. and Network Associates, Inc. (which acquired PGP, Inc. in December 1997) recently settled the former's long running patent infringement and copyright violation lawsuits against PGP Inc.

FN19. See § 4(a), infra.

FN20. 22 CFR Parts 120-130, as amended by 58 Fed. Reg. 39280 (Jul. 22, 1993). The legislative authority for the ITAR is § 38 of the Arms Export Control Act, as amended, Pub. L. No. 90-629, 82 Stat. 1320 (codified at 22 USC § 2778). The USML is found at 22 CFR § 121.1.

FN21. For a more detailed treatment of the ITAR regulations, see an earlier version of this article, Rubinstein, "Export Controls on Encryption Software," COPING WITH U.S. EXPORT CONTROLS, 1996, 309, at 318-322 (PLI Com. Law & Practice Course Handbook Series No. A4-4512, 1996).

FN22. Executive Order 13026, 61 Fed. Reg. 58767 (November 15, 1996), is available on-line at http://www.bxa.doc.gov/Encryption/eo13026.htm.

FN23. 61 Fed. Reg. 68572 (Dec. 30, 1996) (hereinafter the "December 30, 1996 regulations"); available on-line at http://www.bxa.doc.gov/Regulations/encreg.pdf.

FN24. 61 Fed. Reg. 68573.

FN25. EAR § 742.15

FN26. Id.

FN27. EAR § § 732.2(b), 734.3(b)(3), and 734.7(c); see also ECCN 5D002, note.

FN28. EAR § § 732.2(d), 732.2(e)(2), and 734.4(b).

FN29. EAR § 768.1(b).

FN30. EAR § 740.13(d)(2) and Supplement No. 2 to Part 774, Note; see also ECCN 5D002, note.

FN31. EAR § 744.9(a).

FN32. EAR § 734.2(b)(9)(ii).

FN33. Executive Order 12981, Section 1, 60 Fed. Reg. 62981 (Dec. 8, 1995).

FN34. Executive Order 13026, Section 1(b)(1), note 22, supra, amending Executive Order 12981 to add a new section 6; see also EAR § 750.3(b)(2)(5).

FN35. The old saw is that "NSA" stood for "No Such Agency." For more on the NSA's history, see generally, Bamford, THE PUZZLE PALACE: A REPORT ON AMERICA'S MOST SECRET AGENCY, Penguin (1983).

FN36. William P. Crowell, (the retired) NSA Deputy Director, testified at several Congressional hearings on encryption export bills.

FN37. See § 4(a), infra. Indeed, in a court filing in one of these cases, the NSA Deputy Director described the agency's role in the ITAR export review process as follows:

The National Security Agency is the agency with technical expertise for evaluating whether cryptographic devices or software fall within category XIII(b)(1) of the USML. NSA makes recommendations regarding two aspects of the export review process: (1) actual license applications to export a commodity and (2) determinations as to which agency of the government has jurisdiction to license the export of a commodity, i.e., commodity jurisdiction determinations. License applications for the permanent or temporary export of cryptographic products are forwarded by the State Department to NSA for an assessment of whether the approval of an export license could have a negative impact on the national security interests of the United States. In making this assessment, NSA considers several factors including the sensitivity of the technology proposed for export, and the declared end-user and end-use of the commodity. In addition, through the commodity jurisdiction process, NSA provides the Department of State with technical advice to determine whether the commodity is... covered under Category XIII(b)(1) of the... USML.

See Karn v. U.S. Department of State, No. 95-1812 (CRR) (D. D.C. Mar. 22, 1996), Declaration of William P. Crowell, paragraphs 5 and 6, available on-line from Phil Karn's Web page at http://people.qualcomm.com/karn/export/crowell.html.

FN38. See Christensen, "Technology and Software Controls Under the Export Administration Regulations," COPING WITH U.S. EXPORT CONTROLS, 1997, 365 (PLI Com. Law & Practice Course Handbook Series No. A4-4527, 1997).

FN39. EAR § 734.3(b)(3). See generally Supplement 1 to Part 734.

FN40. See note 27, supra. This elimination of public domain treatment does not apply to encryption technology, however.

FN41. EAR § 734.3(b)(2) and (3), note. Taking advantage of this exception for printed materials, in June 1997 Warthman Associates published a book containing printed source code for PGP 5.0. A group of non-U.S. volunteers then scanned all 12 volumes into a computer using OCR (Optical Character Recognition) software, thereby making the source code available in electronic form. See CRACKING DES, note 13, supra, which includes a printed version of scannable source code for controlling the DES Cracker hardware. BXA has maintained that the Administration continues to review whether and to what extent scannable printed code should be subject to the EAR and has reserved the option to impose export controls on such materials. But given the wide exportability of source code under the recent changes to the EAR, see § 3(c)(1), infra, these questions are largely moot.

FN42. For example, on April 12, 1994, BXA notified John Gilmore that since the "bones" version of Kerberos software "is available to the public without charge over the Internet, it is eligible for export under General License GTDA" [the predecessor to License Exception TSU]. ("Bones" is a stripped-down version of the MIT Kerberos network security software, with all the cryptographic code, and all the calls to cryptographic code, removed. See http://www.toad.com/gnu/export/kerberos-bones.commerce.2nd.ans).

FN43. Memorandum and Executive Order 13026, Section 4, note 22, supra. See also, EAR § 742.15 and ECCN 5D002. BXA re-stated this position most recently in a letter to Phil Karn's attorney dated November 17, 1997; see note 114, infra.

FN44. See § 4(a), infra.

FN45. Supplement No. 2 to Part 774.

FN46. These definitions are found at Part 772.

FN47. 22 CFR 120.9(1) and see generally Rubinstein, note 21, supra.

FN48. Category XIII(b)(1), 22 CFR 121.1 (1996).

FN49. Currently, this consists of Cuba, Iran, Iraq, Libya, North Korea, and Sudan. (Sudan is the only country with an X in the box, but the remainder is subject to other controls referenced in the chart and EAR Part 746.) Syria is listed in AT Column 1 but not Column 2, so it may receive such exports under NLR.

FN50. EAR § 742.15(b)(1).

FN51. 56-bit products that use a key exchange mechanism with a key length greater than 512-bits, up to 1024-bits, are also eligible for preferential export treatment, but these products are exportable under an entirely separate provision of the EAR -- License Exception ENC. See § 3(b)(3)(ii), infra.

FN52. It is a well established interpretation that substantial support does not include telephone (voice only) help line services for installation or basic operation training provided by the supplier.

FN53. See the Cryptography Note in part 2 of Category 5 of the CCL.

FN54. EAR § 742.15(b)(1)(iii)(B).

FN55. See Appendix 1, infra.

FN56. See § 3(b)(1), supra.

FN57. EAR § 740.13(e)(1)

FN58. EAR § 740.13(e)(2)

FN59. EAR § 740.13(e)(1). The regulations go on to state that intellectual property protection will not, by itself, be construed as the kind of "express agreement for the payment of a licensing fee ..." referred to above.

FN60. There are embargoes on other countries administered by the Treasury Department's Office of Foreign Asset Controls (OFAC). 31 CFR Chapter V. But the only OFAC restriction on exports to Syria is related to financial transactions in which a U.S. person knows or has reasonable cause to believe that there is a risk of furthering terrorist acts in the United States. 31 CFR Part 596.

FN61. Retail encryption commodities, software and components are products:

(i) Generally available to the public by means of any of the following:

(A) sold in tangible form through retail outlets independent of the manufacturer;

(B) specifically designed for individual consumer use and sold or transferred through tangible or intangible means; or

(C) which are sold or will be sold in large volume without restriction through mail order transactions, electronic transactions, or telephone call transactions; and

(ii) Meeting all of the following:

(A) the cryptographic functionality cannot be easily changed by the user;

(B) substantial support is not required for installation and use;

(C) the cryptographic functionality has not been modified or customized to customer specification; and

(D) are not network infrastructure products such as high end routers or switches designed for large volume communications.

FN62. EAR § 740.17(b)(3)(iii).

FN63. EAR § 740.17(b)(3)(v).

FN64. EAR § 740.17(b)(3)(v).

FN65. EAR § 740.17(b)(3)(vi).

FN66. EAR § 740.17(b)(3)(iv).

FN67. EAR § 734.4(b).

FN68. "Cryptanalytic items" is defined in Part 772 of the EAR as "Systems, equipment, applications, specific electronic assemblies, modules and integrated circuits designed or modified to perform cryptanalytic functions, software having the characteristics of cryptanalytic hardware or performing cryptanalytic functions, or technology for the development, production or use of cryptanalytic commodities or software."

FN69. These countries include: Austria, Australia, Belgium, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Japan, Luxembourg, Netherlands, New Zealand, Norway, Poland, Portugal, Spain, Sweden, Switzerland, and the United Kingdom.

FN70. EAR § 740.17(a).

FN71. EAR § 740.17(d)(1)(i).

FN72. Part 772 of the EAR defines "U.S. subsidiary," as applied to encryption items, as:

(a) A foreign branch of a U.S. company; or

(b) A foreign subsidiary or entity of a U.S. entity in which:

The U.S. entity beneficially owns or controls (whether directly or indirectly) 25 percent or more of the voting securities of the foreign subsidiary or entity, if no other person owns or controls (whether directly or indirectly) an equal or larger percentage; or
The foreign entity is operated by the U.S. entity pursuant to the provisions of an exclusive management contract; or
A majority of the members of the board of directors of the foreign subsidiary or entity also are members of the comparable governing body of the U.S. entity; or
The U.S. entity has the authority to appoint the majority of the members of the board of directors of the foreign subsidiary or entity; or
The U.S. entity has the authority to appoint the chief operating officer of the foreign subsidiary or entity.

FN73. EAR § 740.17(b)(1). However, any subsequent reexport to another foreign end-user that is not a subsidiary of a U.S. company would require prior review and classification by BXA, as would any sale or retransfer by the U.S. subsidiary of any items produced or developed using the U.S.-origin encryption item(s).

FN74. It is also worth noting that transfers of controlled encryption items to U.S. subsidiaries for internal company use are exempt from ENC reporting requirements. See EAR § 740.17(e)(1)(i).

FN75. EAR § 740.17(b)(2)(i).

FN76. EAR § 740.17(d)(1)(ii).

FN77. EAR § 740.17(b)(2)(ii).

FN78. For a general critique of the Administration's key recovery policy, see "The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption," A Report by an Ad Hoc Group of Cryptographers and Computer Scientists, Center for Democracy and Technology (May 27, 1997); the June 1998 edition of this report is available on-line at http://www.cdt.org/crypto/risks98/. The report argues that law enforcement requirements for key recovery are almost wholly incompatible with those of commercial encryption users (e.g., requirements for access without notice to the user; ubiquitous international adoption of key recovery; around-the-clock access to plaintext; and recovery of communications traffic as well as stored data); and that "key recovery systems are inherently less secure, more costly, and more difficult to use than similar systems without a key recovery feature." It should be noted that despite the large number of hardware and software vendors that participated in the Key Recovery Alliance, very few companies have ever received export approval for products that fully comply with the technical requirements of License Exception KMI.

FN79. BXA has posted on its Web page instructions concerning "Applying for an Encryption Licensing Arrangement"; see http://www.bxa.doc.gov/Encryption/ela.htm. Note that EAR Section 750.7(c)(2) allows an exporter to request changes to the country scope of ELAs more informally by submitting a letter rather than filing a new ELA.

FN80. EAR § 740.17(b)(2)(i). See also § 3(b)(3)(ii), supra.

FN81. EAR § 740.17(b)(2)(ii). See also § 3(b)(3)(ii), supra.

FN82. Depending on the destination and other factors, a Treasury Department Office of Foreign Assets Controls (OFAC) license may be required, rather than a BXA license.

FN83. See § 3(c)(3) for a discussion of the deemed export rule.

FN84. EAR § 740.13(e).

FN85. EAR § 740.17(b)(4)(i).

FN86. EAR § 740.17(e)(3). Exporters are required to report biannually the names and addresses of the manufacturers to which the source code was directly transferred or sold. In addition, when the foreign products using the source code are made available for commercial sale, the exporter must submit a non-proprietary technical description of the product.

FN87. EAR § 740.17(b)(4)(ii). See § 4(b) for a discussion of open cryptographic interfaces.

FN88. EAR § 740.17(a).

FN89. EAR § 740.17(b)(4)(ii).

FN90. Following the submission of the completed classification request, immediate export to non-government end-users is permitted by EAR § 740.17(b)(4)(ii), and immediate export to any end-user in the EU+8 is permitted under EAR § 740.17(d)(1)(i).

FN91. EAR § 740.17(c).

FN92. See § 3(c)(2) for a more detailed discussion of the rules applicable to Internet distribution of encryption software.

FN93. Screening out government end-users outside of the EU+8 countries would be required only if the commercial source code were posted to an access- restricted site. If the code were posted to a public website, it would become "publicly available" commercial source code, and therefore would not require any screening.

FN94. BXA publishes a Table of Denial Orders, which is a list of individuals and entities that have been denied export privileges as a result of violating the EAA or the EAR. EAR § 764.6, 127.1(a). Virtually all transactions with such persons are prohibited.

FN95. These guidelines were never formalized nor did DTC issue any Advisory Opinions or licenses approving any particular method of secure downloading. However, if a vendor submitted a written description of the download site, DTC would respond with a letter indicating whether it had any objections to the proposed safeguards.

FN96. EAR § 734.2(b)(9)(ii).

FN97. EAR § 734.2(b)(9)(iii).

FN98. 65 Fed. Reg. 2492 (January 14, 2000) (emphasis added).

FN99. See EAR § 779.1(b)(1)(ii) as amended by 59 Fed. Reg. 13449 (Mar. 22, 1994); now § 734.2(b)(2)(ii). Section 734.2(b)(3) defines "release" to include visual inspection by foreign nationals of U.S.-origin equipment and facilities; oral exchange of information in the U.S. or abroad; and the application to situations abroad of personal knowledge or technical experience acquired in the U.S.

FN100. 8 USC 1324b(a)(3).

FN101. An alien who seeks entry to the U.S. to engage in activities that violate or evade export control laws is excludable, while an alien who has engaged, is engaged, or at any time after entry engages in such activities is deportable. For a detailed (but now dated) discussion see generally Rubinstein, "Export Controls and Immigration Law," 93-3 Immigration Briefings (Mar. 1993).

FN102. See § 4(a), infra.

FN103. Even prior to the January 2000 changes to the EAR, the application of the deemed export rule to encryption technology was extremely limited. This is because the Public Domain EI-Software Carve-Out applies to encryption software only and not to encryption technology. Thus, any publicly available encryption technology is not subject to EI controls. In other words, even before January 2000, there were no restrictions on a foreign national employee's access to ECCN 5E002 technology if the technology was in the public domain.

FN104. EAR § 740.17(b)(1). This language was originally added in the January 14, 2000 regulations, but it referred to "foreign employees in the U.S." 65 Fed. Reg. 2497 (January 14, 2000). It was subsequently amended to refer to "foreign nationals in the United States" so that companies could disclose controlled encryption technology to persons who may not be permanent employees (e.g., contractors, interns). Both of these changes were made to address a hole in the regulations created by the license exception treatment given to exports of encryption items to foreign subsidiaries of U.S. companies. These companies found themselves in the position of being able to supply controlled encryption technology to foreign nationals if those persons were located in an overseas subsidiary or facility, but being prohibited from disclosing the technology if those same persons were located in the United States!

FN105. BXA has published a fact sheet with information on deemed export license applications, which is available in .pdf format at http://www.bxa.doc.gov/DeemedExports/foreignationals.pdf.

FN106. EFF is a non-profit organization dedicated to protecting civil liberties in cyberspace.

FN107. A group mainly composed of hackers and software engineers interested in promoting the use of strong cryptography. See generally Levy, "The Cypherpunks vs. Uncle Sam," New York Times Magazine, June 12, 1994 at 44.

FN108. "Civilizing the Electronic Frontier," Infosecurity News, 16 (March/April 1994).

FN109. Karn v. U.S. Department of State, 925 F. Supp. 1 (D. D.C. Mar. 22, 1996).

FN110. Bernstein v. Department of State, 922 F. Supp. 1426 (N.D. Cal. 1996) (Bernstein I).

FN111. See note 4, supra.

FN112. See "Words for Exports--but Not Electrons," Washington Post, (Oct. 15, 1994), quoting Dave Banisar of the Electronic Privacy Information Center.

FN113. See "A Tale of Two Crypto Cases," Information Law Alert, (May 3, 1996) quoting Lee Tien, an attorney in the Bernstein case.

FN114. The BXA documents as well as Karn's amended complaint are available on-line at http://people.qualcomm.com/karn/export/history.html.

FN115. Bernstein v. Department of State, 945 F. Supp. 1279 (N.D. Cal. 1996) (Bernstein II).

FN116. Bernstein v. Department of State, 974 F.Supp. 1288, (N.D. Cal. 1997) (Bernstein III).

FN117. But see Junger v. Daley, No. 1:96-CV-1723 (N.D. Ohio July 2, 1998), http://samsara.law.cwru.edu/comp_law/jvd/pdj11.html (holding that encryption source code does not warrant the same constitutional protection as other speech and granting summary judgment dismissing a suit challenging regulations that forbid the publication of encryption programs on the Internet). The Ohio court explicitly rejected Judge Patel's holding that source code is speech, saying: "The Bernstein court's assertion that 'language equals protected speech' is unsound. 'Speech' is not protected simply because we write it in a language." Rather, in agreement with BXA's position, the court viewed source code as a purely functional device: "The court in Bernstein misunderstood the significance of source code's functionality. Source code is 'purely functional,' in a way that the Bernstein Court's examples of instructions, manuals, and recipes are not. Unlike instructions, a manual, or a recipe, source code actually performs the function it describes. While a recipe provides instructions to a cook, source code is a device, like embedded circuitry in a telephone, that actually does the function of encryption."

FN118. See "Injunction Blocked in Crypto Case," News.com (Aug. 29, 1997) (quoting an anonymous source), http://www.news.com/News/Item/0,4,13856,00.html. The court then entered a second "Stay Pending Appeal Order" blocking the government from enforcing the regulations declared unconstitutional under the earlier decision; available on-line at http://www.eff.org/pub/Privacy/ITAR_export/Bernstein_case/Legal/970909_stay_order.images/.

FN119. See "Landmark Crypto Appeal Begins," News.com (Dec. 8, 1997) http://www.news.com/News/Item/0,4,17114,00.html.

FN120. Bernstein v. U.S. Dept. of Justice, 176 F.3d 1132 (9th Cir. 1999).

FN121. Bernstein v. U.S. Dept. of Justice, 192 F.3d 1308 (9th Cir. 1999).

FN122. The O'Brien test requires that a regulation be within the constitutional power of the government, that it further an "important and substantial government interest," and that the regulation be narrowly tailored to the government interest.

FN123. DTC's position seemed to be that "crypto with a hole" was subject to ITAR jurisdiction under Category XIII(b)(1), which covered "cryptographic systems, equipment, assemblies, modules, integrated circuits, components or software with the capability of maintaining secrecy or confidentiality." In addition, Category XIII(b)(5) applied to "[a]ncillary equipment specifically designed or modified for paragraphs (b)(1)...of this category." Although DTC never issued a formal public decision on "crypto with a hole," it apparently viewed this language as encompassing software specifically designed or modified to operate with or facilitate the operation of a cryptographic system, even if the software per se lacked encryption functionality.

FN124. Executive Order 13026, Section 1, note 22, supra.

FN125. While it may be feasible to access and modify an undocumented or closed cryptographic interface by decompiling a program's source code, carefully analyzing its functions, and then re-linking the product's code with a new security feature or algorithm, the effort required to make such modifications is generally greater than the effort of implementing a separate encryption program in the first place.

FN126. More information on Microsoft's CryptoAPI technology is available at http://www.microsoft.com/technet/security/cryptech.asp.

FN127. For example, a German company called Brokat Systeme offered a Java- based encryption toolkit for secure Internet banking and shopping which, according to the Brokat Web page, implements "highly secure 128-bit transaction encryption despite U.S. export restrictions." For more on Brokat, see "U.S. Restrictions Give European Encryption a Boost," Cybertimes (April 7, 1997). And a British company called UK Web marketed its 128-bit add-on product by noting that: "Safe Passage 1.1 is a product designed to let export-crippled browsers such as Netscape Navigator and Microsoft Internet Explorer use full 128-bit (or greater) encryption when talking to secure Web servers. It does this by acting as a full-strength, encrypting Web proxy that transparently intercedes between your browser and the Web, much like a proxy server. If your browser does not support full encryption, client authentication certificates, or smart cards, Safe Passage can provide these features for you, free from export crippling." UK Web also sold the 128-bit Stronghold web server, which was the second most popular commercial Web server for UNIX. For more on Stronghold, see "Politics for the Really Cool," Forbes, 172-79 (September 8, 1997).

FN128. The SSL Protocol Specification is detailed at http://home.netscape.com/eng/ssl3/ssl-toc.html (SSL 3).

FN129. See, e.g., ftp://ftp.psy.uq.oz.au/Crypto/.

FN130. Another example of foreign development of a cryptographic add-on for U.S. products became well known in May 1997 when Sun Microsystems announced that a Russian firm (Elvis+) had developed security software that would allow non-U.S. customers to use 128-bit encryption with Sun products. According to a Sun press release (http://www.skip.org/press-elvis.html) and published accounts, the Elvis+ add-on product uses a security protocol called SKIP, which was developed by Sun, although Sun reportedly provided no technical assistance to Elvis+ in developing its encryption software. See "Sun Dodges Crypto Export Limits," News.com, May 19, 1997 (http://www.news.com/News/Item/0,4,10778,00.html); "Sun Microsystems' Sale of Encryption Software Tests U.S. Law," Cybertimes, May 20, 1997. Sun planned to market the Elvis+ product in the U.S., while Elvis+ would handle distribution to overseas customers. In August 1997, however, Sun announced that the ship date had slipped and that it was cooperating with a Commerce Department review. See "Elvis Crypto Has Not Left the Building," News.com, August 18, 1997 (http://www.news.com/News/Item/0,4,13483,00.html). Assuming that Elvis+ relied on publicly available encryption algorithms, publicly available security protocols, and non-U.S. encryption source code; that the Elvis+ product did not commingle more than a de minimis amount of U.S.-origin software; and that Sun did not provide any prohibited technical assistance to Elvis+, it is not clear how BXA could assert EAR jurisdiction over Sun's marketing of a product developed overseas. Neither Sun nor BXA has publicly commented on any final resolution of the matter.

FN131. In a letter to Peter Junger's attorney (see note 117, supra, for more on the Junger case), BXA has stated that publishing a Web page containing links to foreign sites that contain encryption programs is not subject to the EAR per § 734.2(b). See http://samsara.law.cwru.edu/comp_law/jvd/pdj-bxa-gjs070397.html.

FN132. Unpublished State Department document entitled "USML Rationalization Exercise"; copy in authors' possession.

FN133. Canada, for example, does not carve out encryption software from the mass market exemption. Canadian companies have taken advantage of this difference between U.S. and Canadian export law to market 128-bit products on a worldwide basis, much to the surprise of BXA. See, e.g., "Canadian Product Puts New Spin on Encryption Debate," Cybertimes (Aug. 1, 1997).

FN134. White House, Office of the Press Secretary, "Statement by the Press Secretary," (April 16, 1993). See also Federal Information Processing Standards Publication 185, Escrowed Encryption Standard (EES), 59 Fed. Reg. 5997 (1994) (FIPS 185).

FN135. Skipjack uses 80-bit keys, compared with 56-bit keys for DES, and 40-bit keys for exportable versions of RC2 and RC4.
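The practical significance of these key lengths can be made concrete: each additional bit doubles the number of possible keys, so the gaps between 40, 56, and 80 bits are vast. A brief illustration (added here for clarity; the algorithm names and figures come from the footnote above, the arithmetic is standard):

```python
# Brute-force key-space sizes for the key lengths discussed in FN135.
# Each additional key bit doubles the number of possible keys.
for name, bits in [("RC2/RC4 (export)", 40), ("DES", 56), ("Skipjack", 80)]:
    print(f"{name}: 2**{bits} = {2**bits:,} possible keys")

# Skipjack's 80-bit key space is 2**24 times larger than DES's 56-bit
# key space, i.e. about 16.7 million times larger.
print(2**80 // 2**56)  # 16777216
```

The same arithmetic explains why 40-bit "exportable" ciphers were considered weak: a 56-bit DES key space is 2**16 (65,536) times larger than a 40-bit one.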

FN136. For an exhaustive treatment of the Clipper controversy, see Froomkin, "The Metaphor is The Key: Cryptography, the Clipper Chip, and the Constitution," 143 U. Penn. L. Rev. 709-897 (1995); see also "Report of a Special Panel of the ACM U.S. Public Policy Committee," Association for Computing Machinery (June 1994); for a more general treatment of national cryptography policy, see U.S. Congress, Office of Technology Assessment, INFORMATION SECURITY AND PRIVACY IN NETWORK ENVIRONMENTS (1994) (OTA-TCT-606); and the more recent and comprehensive NRC study, note 4, supra.

FN137. Available on-line at http://www.cdt.org/crypto/goreltr.html.

FN138. For details of the September 5-6, 1996, NIST meetings, see generally http://csrc.nist.gov/keyrecovery/september_issues_mtg/.

FN139. Available on-line at http://csrc.nist.gov/keyrecovery/criteria.txt.

FN140. For background on Pro-CODE and the text of S. 1726, see http://www.cdt.org/crypto/legis_104/pro_CODE.html. On February 27, 1997, Senator Burns re-introduced the Pro-CODE bill as S. 377, available on-line at http://www.cdt.org/crypto/legis_105/pro_CODE/index.shtml. On March 19, 1997, however, the Senate Commerce Committee approved the Secure Public Networks bill, which was substituted for Pro-CODE over the objections of Sen. Burns. See text accompanying note 162, infra.

FN141. "Enabling Privacy, Commerce, Security and Public Safety in the Global Information Infrastructure." Available on-line at http://www.cdt.org/crypto/clipper_III/clipper_III_draft.html. For a detailed analysis of this white paper, see Froomkin, "It Came From Planet Clipper: The Battle Over Cryptographic Key 'Escrow'," 1996 U. Chi. L. Forum 15, 50-60.

FN142. The October 1 announcement is available on-line at http://www.epic.org/crypto/key_escrow/clipper4_statement.html. The Memorandum and Executive Order 13026, 61 Fed. Reg. 58767 (November 15, 1996) are available on-line at http://www.bxa.doc.gov/Encryption/m961115.htm and http://www.bxa.doc.gov/Encryption/eo13026.htm, respectively.

FN143. 61 Fed. Reg. 68572 (Dec. 30, 1996); available on-line at http://www.bxa.doc.gov/Regulations/encreg.pdf.

FN144. The CCL lists all items that are subject to licensing under the EAR. CCL entries are identified by an Export Control Classification Number (ECCN). The introduction to Part 738 of the EAR explains the structure of the CCL and how to understand ECCNs.

FN145. 15 CFR Parts 730-774 as amended by 61 Fed. Reg. 12714 (Mar 25, 1996). The legislative authority for the EAR is the Export Administration Act of 1979, Pub. L. No. 96-72, 93 Stat. 503 (codified as amended at 50 USC app. §§ 2401-2420) (hereinafter EAA). The CCL is found at 15 CFR Part 774, Supplement No. 1. The EAA has lapsed several times, most recently on Sept. 30, 1990. When the EAA lapses, its provisions are maintained by Executive Order under the authority of the International Emergency Economic Powers Act, Pub. L. No. 95-223, Title II, § 201, 92 Stat. 1626 (codified at 50 USC §§ 1701-1706). See Executive Order 12924 of August 19, 1994 (59 Fed. Reg. 43437), notice of August 15, 1995 (60 Fed. Reg. 42767), notice of August 14, 1996 (60 Fed. Reg. 42527), and notice of August 13, 1998 (63 Fed. Reg. 44121).

FN146. See EAR, Supplement No. 2 to Part 774.

FN147. Christensen, note 38, supra, at 385.

FN148. CoCom established the General Software Note multilaterally and the U.S. implemented it in 1990. 55 Fed. Reg. 26655, at 26658 (then Adv. Note 5 to Supp. 3 to EAR Part 799) (Jun. 29, 1990); extended as the General Software Note in the CoCom Core List implementation, 56 Fed. Reg. 42824, 42861-62 (Aug. 29, 1991). The U.S. first proposed creation of a mass-market exclusion from controls in a proposed rewrite of controls on technical data and software published on October 13, 1988. 53 Fed. Reg. 40074.

FN149. In addition to excluding mass-market encryption software from License Exception TSU, it appears that the U.S. government pursued efforts to persuade other members of the Wassenaar Arrangement (the successor to CoCom) to restrict eligibility of all encryption software under the General Software Note. On September 1, 1998, the Global Internet Liberty Campaign issued a letter calling for the removal of cryptography export restrictions from the Wassenaar Arrangement List of Dual-Use Goods and Technologies; see http://www.gilc.org/crypto/wassenaar/gilc-statement-998.html. A copy of this list is available on-line at http://www.jya.com/wa/watoc.htm.

FN150. See § 2(d), supra.

FN151. All of the following requirements must be met:

The data encryption algorithm must be RC2 and/or RC4 with a key space of no more than 40 bits.
If both RC4 and RC2 are used in the same software, their functionality must be separate (i.e., no data can be operated on by both algorithms).
The software must not allow the alteration of the data encryption or key management mechanisms (and their associated key spaces) by the user or any other program.
The key exchange used in data encryption must be: (a) a public key algorithm that does not exceed a 512-bit modulus for the key space and/or (b) a symmetric algorithm (including but not limited to RC2, DES, and proprietary algorithms) with a key space not exceeding 64 bits.
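The four conditions above amount to a checklist a vendor could apply to a product's cryptographic profile. The following sketch encodes that checklist; it is purely illustrative — the function and field names are the authors' hypothetical constructs, not part of any regulation or official compliance tool:

```python
# Hypothetical sketch: the four 40-bit eligibility conditions listed
# above, expressed as a checklist over a product description.
# All names here are illustrative assumptions.

def eligible_for_40bit_exception(product: dict) -> bool:
    """Return True only if all four conditions in the list above hold."""
    # 1. Data encryption limited to RC2 and/or RC4 with keys of 40 bits or less.
    ok_algorithm = (
        set(product["data_encryption_algorithms"]) <= {"RC2", "RC4"}
        and product["data_key_bits"] <= 40
    )
    # 2. If both RC2 and RC4 are present, they must not operate on the same data.
    ok_separation = not (
        {"RC2", "RC4"} <= set(product["data_encryption_algorithms"])
        and product["algorithms_share_data"]
    )
    # 3. Neither the user nor other programs may alter the crypto mechanisms.
    ok_fixed = not product["user_can_alter_crypto"]
    # 4. Key exchange: public key modulus <= 512 bits; symmetric <= 64 bits.
    ok_key_exchange = (
        product.get("public_key_modulus_bits", 0) <= 512
        and product.get("symmetric_key_exchange_bits", 0) <= 64
    )
    return ok_algorithm and ok_separation and ok_fixed and ok_key_exchange

example = {
    "data_encryption_algorithms": ["RC4"],
    "data_key_bits": 40,
    "algorithms_share_data": False,
    "user_can_alter_crypto": False,
    "public_key_modulus_bits": 512,
    "symmetric_key_exchange_bits": 64,
}
print(eligible_for_40bit_exception(example))  # True
```

A product failing any single condition — say, a 56-bit data key — would fall outside the exception and require individual licensing, which is the point the surrounding footnotes make about DES-strength software.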
FN152. The ITAR regime also allowed CJ transfer under the 15-day procedure of certain products that either met the 40-bit standard but were not strictly speaking "mass market" products or that deviated in some way from the 40-bit standard, either by using a different algorithm (such as DES) or a longer key space. Section 742.15(b) of the 1996 regulations made some effort to accommodate past DTC licensing practice by allowing 15-day processing of "company proprietary" software, but this phrase did not fully capture all of the software products that DTC and NSA previously approved for transfer to the EAR. Moreover, Supplement 6 to Part 742 only covered mass market software, which wrongly suggested that company proprietary software (whether narrowly or broadly understood) was not eligible for 15-day processing (it was).

FN153. The full text of H.R. 695 is available on-line at http://www.cdt.org/crypto/legis_105/SAFE/hr695_text.html; this bill is identical to H.R. 3011, which Cong. Goodlatte introduced during the 104th Congress.

FN154. The following summary is derived from the analysis provided by the Center for Democracy and Technology at http://www.cdt.org/crypto/legis_105/SAFE/.

FN155. The amendment is available on-line at http://www.cdt.org/crypto/legis_105/SAFE/970722_amd_Gilman.html.

FN156. The amendment is available on-line at http://www.cdt.org/crypto/legis_105/SAFE/970909_amd.html.

FN157. See http://www.cdt.org/crypto/fbi_draft_text.html.

FN158. See http://www.cdt.org/crypto/legis_105/SAFE/Markey_White.html.

FN159. See http://www.cdt.org/crypto/legis_105/SAFE/Oxley_Manton.html.

FN160. A letter from 63 organizations is available on-line at http://www.cdt.org/crypto/legis_105/SAFE/970922_OxlMan.html.

FN161. See, e.g., "House Panel Rejects FBI Plan on Encryption," Cybertimes, (Sept. 25, 1997).

FN162. The bill is available on-line at http://www.cdt.org/crypto/legis_105/mccain_kerrey/billtext.html.

FN163. Frist's amendments are available on-line at http://www.cdt.org/crypto/legis_105/mccain_kerrey/Frist_amends.html. Senator Kerrey's position seems to have shifted, however. In a floor speech delivered on October 12, 1998, he said:

Personal privacy in the digital world should not suffer at the hand of unreasonable export laws. Therefore, Congress should take action in the coming year to remove export restrictions on encryption products of any strength. I am confident that through cooperation between government and industry, encryption can be exported without compromising the legitimate needs of law enforcement and national security. A compromise can be crafted if all parties, both private and public, are willing to work together to solve the common goal of maintaining America's national security in the new digital world. We should create in law a panel consisting of members of Congress, Administration officials, and leaders in high-technology industries to address the implications of information technology on our society and our security. We should also create a new national laboratory for information technology that will both perform research in this field and serve as a forum for further discussions of the issues arising from information technology.

FN164. The bill is available on-line at http://www.cdt.org/crypto/legis_105/eprivacy/.

FN165. See, e.g., the discussion of "informal noncodified exemptions" in the CRISIS report at 117-121, note 4, supra.

FN166. Category XIII(b)(1)(ii); 22 CFR 121.1.

FN167. ECCN 5A002, note h (although as published this note inadvertently omitted language regarding "interbanking transactions"); the Financial Regulations corrected this omission.

FN168. See note 142, supra.

FN169. See "Encryption Exports Approved for Electronic Commerce," (May 8, 1997) available on-line at http://www.bxa.doc.gov/press/97/banks2.htm. Shortly thereafter, the Administration approved export licenses or ELAs for 128-bit software in on-line financial services, presumably on the grounds that banks are already highly regulated and thus should be able to export and receive exports of any strength encryption under an Encryption Licensing Arrangement, and without any need for key recovery.

FN170. 63 Fed. Reg. 50516 (Sept. 16, 1998), available on-line in .pdf format at the BXA's Commercial Encryption Export Controls Web page, http://www.bxa.doc.gov/Encryption/Default.htm (hereinafter, the "Financial Regulations"); see Appendix 1, infra.

FN171. Financial Regulations, § 742.15(b)(1)(i).

FN172. The text of the White House Press Release on the updating of the Administration's Encryption Policy and the text of the White House Briefing on the Administration's Encryption Policy are available on-line.

FN173. The regulations defined a "U.S. subsidiary" as follows:

(a) A foreign branch of a U.S. company; or

(b) A foreign subsidiary or entity of a U.S. entity in which:

The U.S. entity beneficially owns or controls (whether directly or indirectly) 25 percent or more of the voting securities of the foreign subsidiary or entity, if no other person owns or controls (whether directly or indirectly) an equal or larger percentage; or
The foreign entity is operated by the U.S. entity pursuant to the provisions of an exclusive management contract; or
A majority of the members of the board of directors of the foreign subsidiary or entity also are members of the comparable governing body of the U.S. entity; or
The U.S. entity has the authority to appoint the majority of the members of the board of directors of the foreign subsidiary or entity; or
The U.S. entity has the authority to appoint the chief operating officer of the foreign subsidiary or entity.
FN174. 63 Fed. Reg. 72166 (December 31, 1998).

FN175. Id. at 72161.

FN176. Vendors of mass-market software were disappointed that the regulations placed 56-bit mass-market software products under License Exception ENC, rather than releasing them from EI controls and making them eligible for export under License Exception TSU pursuant to the General Software Note. The practical differences include the following: First, License Exception TSU does not require semi-annual reporting of quantity and ECCNs for non-Wassenaar countries, and most mass market vendors are in no position to collect this data. Second, release from EI controls would make 56-bit mass-market products eligible for various benefits under the EAR, including public domain treatment and the de minimis rule.

FN177. "Recoverable commodities and software" was defined in the regulations as either:

(a) A stored data product containing a recovery feature that, when activated, allows recovery of the plaintext of encrypted data without the assistance of the end-user; or

(b) A product or system designed such that a network administrator or other authorized persons who are removed from the end-user can provide law enforcement access to plaintext without the knowledge or assistance of the end-user. This includes, for example, products or systems where plaintext exists and is accessible at intermediate points in a network or infrastructure system, enterprise-controlled recovery systems, and products which permit recovery of plaintext at the server where a system administrator controls or can provide recovery of plaintext across an enterprise.

63 Fed. Reg. 72166 (December 31, 1998).

FN178. Austria, Australia, Belgium, Canada, Denmark, Finland, France, Germany, Iceland, Ireland, Italy, Japan, Luxembourg, The Netherlands, New Zealand, Norway, Portugal, Spain, Sweden, Switzerland, and the United Kingdom.

FN179. BXA quickly began approving ELAs under this policy for a number of vendors including WebTV, see Web TV Press Release (Oct. 5, 1998), and a coalition of 10 high tech companies including Ascend Communications, Cisco Systems, 3Com, Hewlett-Packard, Network Associates, Nortel Networks, Novell, Red Creek Communications, Secure Computing Corporation, and Sun Microsystems, see Alliance for Network Security Press Release (Oct. 19, 1998).

FN180. 65 Fed. Reg. 2492 (January 14, 2000).

FN181. 65 Fed. Reg. 62609 (October 19, 2000).

FN182. EAR § 740.17(a).

FN183. EAR § 740.17(e)(vii) & (ix).

FN184. EAR § 740.17(e)(viii).